Feb 13 07:51:54.551735 kernel: Linux version 5.15.148-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 11.3.1_p20221209 p3) 11.3.1 20221209, GNU ld (Gentoo 2.39 p5) 2.39.0) #1 SMP Mon Feb 12 18:05:31 -00 2024
Feb 13 07:51:54.551748 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty0 console=ttyS1,115200n8 flatcar.first_boot=detected flatcar.oem.id=packet flatcar.autologin verity.usrhash=f2beb0668e3dab90bbcf0ace3803b7ee02142bfb86913ef12ef6d2ee81a411a4
Feb 13 07:51:54.551755 kernel: BIOS-provided physical RAM map:
Feb 13 07:51:54.551759 kernel: BIOS-e820: [mem 0x0000000000000000-0x00000000000997ff] usable
Feb 13 07:51:54.551762 kernel: BIOS-e820: [mem 0x0000000000099800-0x000000000009ffff] reserved
Feb 13 07:51:54.551766 kernel: BIOS-e820: [mem 0x00000000000e0000-0x00000000000fffff] reserved
Feb 13 07:51:54.551771 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000003fffffff] usable
Feb 13 07:51:54.551775 kernel: BIOS-e820: [mem 0x0000000040000000-0x00000000403fffff] reserved
Feb 13 07:51:54.551778 kernel: BIOS-e820: [mem 0x0000000040400000-0x00000000819e2fff] usable
Feb 13 07:51:54.551782 kernel: BIOS-e820: [mem 0x00000000819e3000-0x00000000819e3fff] ACPI NVS
Feb 13 07:51:54.551787 kernel: BIOS-e820: [mem 0x00000000819e4000-0x00000000819e4fff] reserved
Feb 13 07:51:54.551791 kernel: BIOS-e820: [mem 0x00000000819e5000-0x000000008afccfff] usable
Feb 13 07:51:54.551795 kernel: BIOS-e820: [mem 0x000000008afcd000-0x000000008c0b1fff] reserved
Feb 13 07:51:54.551799 kernel: BIOS-e820: [mem 0x000000008c0b2000-0x000000008c23afff] usable
Feb 13 07:51:54.551804 kernel: BIOS-e820: [mem 0x000000008c23b000-0x000000008c66cfff] ACPI NVS
Feb 13 07:51:54.551809 kernel: BIOS-e820: [mem 0x000000008c66d000-0x000000008eefefff] reserved
Feb 13 07:51:54.551813 kernel: BIOS-e820: [mem 0x000000008eeff000-0x000000008eefffff] usable
Feb 13 07:51:54.551817 kernel: BIOS-e820: [mem 0x000000008ef00000-0x000000008fffffff] reserved
Feb 13 07:51:54.551821 kernel: BIOS-e820: [mem 0x00000000e0000000-0x00000000efffffff] reserved
Feb 13 07:51:54.551826 kernel: BIOS-e820: [mem 0x00000000fe000000-0x00000000fe010fff] reserved
Feb 13 07:51:54.551830 kernel: BIOS-e820: [mem 0x00000000fec00000-0x00000000fec00fff] reserved
Feb 13 07:51:54.551834 kernel: BIOS-e820: [mem 0x00000000fee00000-0x00000000fee00fff] reserved
Feb 13 07:51:54.551838 kernel: BIOS-e820: [mem 0x00000000ff000000-0x00000000ffffffff] reserved
Feb 13 07:51:54.551842 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000086effffff] usable
Feb 13 07:51:54.551846 kernel: NX (Execute Disable) protection: active
Feb 13 07:51:54.551851 kernel: SMBIOS 3.2.1 present.
Feb 13 07:51:54.551856 kernel: DMI: Supermicro SYS-5019C-MR/X11SCM-F, BIOS 1.9 09/16/2022
Feb 13 07:51:54.551860 kernel: tsc: Detected 3400.000 MHz processor
Feb 13 07:51:54.551864 kernel: tsc: Detected 3399.906 MHz TSC
Feb 13 07:51:54.551868 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Feb 13 07:51:54.551873 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Feb 13 07:51:54.551878 kernel: last_pfn = 0x86f000 max_arch_pfn = 0x400000000
Feb 13 07:51:54.551882 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Feb 13 07:51:54.551886 kernel: last_pfn = 0x8ef00 max_arch_pfn = 0x400000000
Feb 13 07:51:54.551891 kernel: Using GB pages for direct mapping
Feb 13 07:51:54.551895 kernel: ACPI: Early table checksum verification disabled
Feb 13 07:51:54.551900 kernel: ACPI: RSDP 0x00000000000F05B0 000024 (v02 SUPERM)
Feb 13 07:51:54.551905 kernel: ACPI: XSDT 0x000000008C54E0C8 00010C (v01 SUPERM SUPERM 01072009 AMI 00010013)
Feb 13 07:51:54.551909 kernel: ACPI: FACP 0x000000008C58A670 000114 (v06 01072009 AMI 00010013)
Feb 13 07:51:54.551913 kernel: ACPI: DSDT 0x000000008C54E268 03C404 (v02 SUPERM SMCI--MB 01072009 INTL 20160527)
Feb 13 07:51:54.551920 kernel: ACPI: FACS 0x000000008C66CF80 000040
Feb 13 07:51:54.551924 kernel: ACPI: APIC 0x000000008C58A788 00012C (v04 01072009 AMI 00010013)
Feb 13 07:51:54.551930 kernel: ACPI: FPDT 0x000000008C58A8B8 000044 (v01 01072009 AMI 00010013)
Feb 13 07:51:54.551934 kernel: ACPI: FIDT 0x000000008C58A900 00009C (v01 SUPERM SMCI--MB 01072009 AMI 00010013)
Feb 13 07:51:54.551939 kernel: ACPI: MCFG 0x000000008C58A9A0 00003C (v01 SUPERM SMCI--MB 01072009 MSFT 00000097)
Feb 13 07:51:54.551944 kernel: ACPI: SPMI 0x000000008C58A9E0 000041 (v05 SUPERM SMCI--MB 00000000 AMI. 00000000)
Feb 13 07:51:54.551948 kernel: ACPI: SSDT 0x000000008C58AA28 001B1C (v02 CpuRef CpuSsdt 00003000 INTL 20160527)
Feb 13 07:51:54.551953 kernel: ACPI: SSDT 0x000000008C58C548 0031C6 (v02 SaSsdt SaSsdt 00003000 INTL 20160527)
Feb 13 07:51:54.551958 kernel: ACPI: SSDT 0x000000008C58F710 00232B (v02 PegSsd PegSsdt 00001000 INTL 20160527)
Feb 13 07:51:54.551963 kernel: ACPI: HPET 0x000000008C591A40 000038 (v01 SUPERM SMCI--MB 00000002 01000013)
Feb 13 07:51:54.551968 kernel: ACPI: SSDT 0x000000008C591A78 000FAE (v02 SUPERM Ther_Rvp 00001000 INTL 20160527)
Feb 13 07:51:54.551973 kernel: ACPI: SSDT 0x000000008C592A28 0008F4 (v02 INTEL xh_mossb 00000000 INTL 20160527)
Feb 13 07:51:54.551977 kernel: ACPI: UEFI 0x000000008C593320 000042 (v01 SUPERM SMCI--MB 00000002 01000013)
Feb 13 07:51:54.551982 kernel: ACPI: LPIT 0x000000008C593368 000094 (v01 SUPERM SMCI--MB 00000002 01000013)
Feb 13 07:51:54.551987 kernel: ACPI: SSDT 0x000000008C593400 0027DE (v02 SUPERM PtidDevc 00001000 INTL 20160527)
Feb 13 07:51:54.551991 kernel: ACPI: SSDT 0x000000008C595BE0 0014E2 (v02 SUPERM TbtTypeC 00000000 INTL 20160527)
Feb 13 07:51:54.551996 kernel: ACPI: DBGP 0x000000008C5970C8 000034 (v01 SUPERM SMCI--MB 00000002 01000013)
Feb 13 07:51:54.552001 kernel: ACPI: DBG2 0x000000008C597100 000054 (v00 SUPERM SMCI--MB 00000002 01000013)
Feb 13 07:51:54.552006 kernel: ACPI: SSDT 0x000000008C597158 001B67 (v02 SUPERM UsbCTabl 00001000 INTL 20160527)
Feb 13 07:51:54.552011 kernel: ACPI: DMAR 0x000000008C598CC0 000070 (v01 INTEL EDK2 00000002 01000013)
Feb 13 07:51:54.552016 kernel: ACPI: SSDT 0x000000008C598D30 000144 (v02 Intel ADebTabl 00001000 INTL 20160527)
Feb 13 07:51:54.552020 kernel: ACPI: TPM2 0x000000008C598E78 000034 (v04 SUPERM SMCI--MB 00000001 AMI 00000000)
Feb 13 07:51:54.552025 kernel: ACPI: SSDT 0x000000008C598EB0 000D8F (v02 INTEL SpsNm 00000002 INTL 20160527)
Feb 13 07:51:54.552030 kernel: ACPI: WSMT 0x000000008C599C40 000028 (v01 SUPERM 01072009 AMI 00010013)
Feb 13 07:51:54.552034 kernel: ACPI: EINJ 0x000000008C599C68 000130 (v01 AMI AMI.EINJ 00000000 AMI. 00000000)
Feb 13 07:51:54.552039 kernel: ACPI: ERST 0x000000008C599D98 000230 (v01 AMIER AMI.ERST 00000000 AMI. 00000000)
Feb 13 07:51:54.552044 kernel: ACPI: BERT 0x000000008C599FC8 000030 (v01 AMI AMI.BERT 00000000 AMI. 00000000)
Feb 13 07:51:54.552049 kernel: ACPI: HEST 0x000000008C599FF8 00027C (v01 AMI AMI.HEST 00000000 AMI. 00000000)
Feb 13 07:51:54.552054 kernel: ACPI: SSDT 0x000000008C59A278 000162 (v01 SUPERM SMCCDN 00000000 INTL 20181221)
Feb 13 07:51:54.552059 kernel: ACPI: Reserving FACP table memory at [mem 0x8c58a670-0x8c58a783]
Feb 13 07:51:54.552063 kernel: ACPI: Reserving DSDT table memory at [mem 0x8c54e268-0x8c58a66b]
Feb 13 07:51:54.552068 kernel: ACPI: Reserving FACS table memory at [mem 0x8c66cf80-0x8c66cfbf]
Feb 13 07:51:54.552072 kernel: ACPI: Reserving APIC table memory at [mem 0x8c58a788-0x8c58a8b3]
Feb 13 07:51:54.552077 kernel: ACPI: Reserving FPDT table memory at [mem 0x8c58a8b8-0x8c58a8fb]
Feb 13 07:51:54.552082 kernel: ACPI: Reserving FIDT table memory at [mem 0x8c58a900-0x8c58a99b]
Feb 13 07:51:54.552087 kernel: ACPI: Reserving MCFG table memory at [mem 0x8c58a9a0-0x8c58a9db]
Feb 13 07:51:54.552092 kernel: ACPI: Reserving SPMI table memory at [mem 0x8c58a9e0-0x8c58aa20]
Feb 13 07:51:54.552097 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c58aa28-0x8c58c543]
Feb 13 07:51:54.552101 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c58c548-0x8c58f70d]
Feb 13 07:51:54.552106 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c58f710-0x8c591a3a]
Feb 13 07:51:54.552111 kernel: ACPI: Reserving HPET table memory at [mem 0x8c591a40-0x8c591a77]
Feb 13 07:51:54.552115 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c591a78-0x8c592a25]
Feb 13 07:51:54.552120 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c592a28-0x8c59331b]
Feb 13 07:51:54.552124 kernel: ACPI: Reserving UEFI table memory at [mem 0x8c593320-0x8c593361]
Feb 13 07:51:54.552130 kernel: ACPI: Reserving LPIT table memory at [mem 0x8c593368-0x8c5933fb]
Feb 13 07:51:54.552135 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c593400-0x8c595bdd]
Feb 13 07:51:54.552139 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c595be0-0x8c5970c1]
Feb 13 07:51:54.552144 kernel: ACPI: Reserving DBGP table memory at [mem 0x8c5970c8-0x8c5970fb]
Feb 13 07:51:54.552148 kernel: ACPI: Reserving DBG2 table memory at [mem 0x8c597100-0x8c597153]
Feb 13 07:51:54.552153 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c597158-0x8c598cbe]
Feb 13 07:51:54.552158 kernel: ACPI: Reserving DMAR table memory at [mem 0x8c598cc0-0x8c598d2f]
Feb 13 07:51:54.552162 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c598d30-0x8c598e73]
Feb 13 07:51:54.552167 kernel: ACPI: Reserving TPM2 table memory at [mem 0x8c598e78-0x8c598eab]
Feb 13 07:51:54.552172 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c598eb0-0x8c599c3e]
Feb 13 07:51:54.552177 kernel: ACPI: Reserving WSMT table memory at [mem 0x8c599c40-0x8c599c67]
Feb 13 07:51:54.552182 kernel: ACPI: Reserving EINJ table memory at [mem 0x8c599c68-0x8c599d97]
Feb 13 07:51:54.552186 kernel: ACPI: Reserving ERST table memory at [mem 0x8c599d98-0x8c599fc7]
Feb 13 07:51:54.552191 kernel: ACPI: Reserving BERT table memory at [mem 0x8c599fc8-0x8c599ff7]
Feb 13 07:51:54.552196 kernel: ACPI: Reserving HEST table memory at [mem 0x8c599ff8-0x8c59a273]
Feb 13 07:51:54.552200 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c59a278-0x8c59a3d9]
Feb 13 07:51:54.552205 kernel: No NUMA configuration found
Feb 13 07:51:54.552210 kernel: Faking a node at [mem 0x0000000000000000-0x000000086effffff]
Feb 13 07:51:54.552215 kernel: NODE_DATA(0) allocated [mem 0x86effa000-0x86effffff]
Feb 13 07:51:54.552220 kernel: Zone ranges:
Feb 13 07:51:54.552225 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Feb 13 07:51:54.552229 kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff]
Feb 13 07:51:54.552234 kernel: Normal [mem 0x0000000100000000-0x000000086effffff]
Feb 13 07:51:54.552239 kernel: Movable zone start for each node
Feb 13 07:51:54.552243 kernel: Early memory node ranges
Feb 13 07:51:54.552248 kernel: node 0: [mem 0x0000000000001000-0x0000000000098fff]
Feb 13 07:51:54.552253 kernel: node 0: [mem 0x0000000000100000-0x000000003fffffff]
Feb 13 07:51:54.552258 kernel: node 0: [mem 0x0000000040400000-0x00000000819e2fff]
Feb 13 07:51:54.552263 kernel: node 0: [mem 0x00000000819e5000-0x000000008afccfff]
Feb 13 07:51:54.552268 kernel: node 0: [mem 0x000000008c0b2000-0x000000008c23afff]
Feb 13 07:51:54.552272 kernel: node 0: [mem 0x000000008eeff000-0x000000008eefffff]
Feb 13 07:51:54.552277 kernel: node 0: [mem 0x0000000100000000-0x000000086effffff]
Feb 13 07:51:54.552282 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000086effffff]
Feb 13 07:51:54.552286 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Feb 13 07:51:54.552295 kernel: On node 0, zone DMA: 103 pages in unavailable ranges
Feb 13 07:51:54.552300 kernel: On node 0, zone DMA32: 1024 pages in unavailable ranges
Feb 13 07:51:54.552305 kernel: On node 0, zone DMA32: 2 pages in unavailable ranges
Feb 13 07:51:54.552310 kernel: On node 0, zone DMA32: 4325 pages in unavailable ranges
Feb 13 07:51:54.552316 kernel: On node 0, zone DMA32: 11460 pages in unavailable ranges
Feb 13 07:51:54.552321 kernel: On node 0, zone Normal: 4352 pages in unavailable ranges
Feb 13 07:51:54.552326 kernel: On node 0, zone Normal: 4096 pages in unavailable ranges
Feb 13 07:51:54.552331 kernel: ACPI: PM-Timer IO Port: 0x1808
Feb 13 07:51:54.552336 kernel: ACPI: LAPIC_NMI (acpi_id[0x01] high edge lint[0x1])
Feb 13 07:51:54.552341 kernel: ACPI: LAPIC_NMI (acpi_id[0x02] high edge lint[0x1])
Feb 13 07:51:54.552346 kernel: ACPI: LAPIC_NMI (acpi_id[0x03] high edge lint[0x1])
Feb 13 07:51:54.552352 kernel: ACPI: LAPIC_NMI (acpi_id[0x04] high edge lint[0x1])
Feb 13 07:51:54.552357 kernel: ACPI: LAPIC_NMI (acpi_id[0x05] high edge lint[0x1])
Feb 13 07:51:54.552362 kernel: ACPI: LAPIC_NMI (acpi_id[0x06] high edge lint[0x1])
Feb 13 07:51:54.552367 kernel: ACPI: LAPIC_NMI (acpi_id[0x07] high edge lint[0x1])
Feb 13 07:51:54.552372 kernel: ACPI: LAPIC_NMI (acpi_id[0x08] high edge lint[0x1])
Feb 13 07:51:54.552377 kernel: ACPI: LAPIC_NMI (acpi_id[0x09] high edge lint[0x1])
Feb 13 07:51:54.552382 kernel: ACPI: LAPIC_NMI (acpi_id[0x0a] high edge lint[0x1])
Feb 13 07:51:54.552387 kernel: ACPI: LAPIC_NMI (acpi_id[0x0b] high edge lint[0x1])
Feb 13 07:51:54.552392 kernel: ACPI: LAPIC_NMI (acpi_id[0x0c] high edge lint[0x1])
Feb 13 07:51:54.552398 kernel: ACPI: LAPIC_NMI (acpi_id[0x0d] high edge lint[0x1])
Feb 13 07:51:54.552403 kernel: ACPI: LAPIC_NMI (acpi_id[0x0e] high edge lint[0x1])
Feb 13 07:51:54.552408 kernel: ACPI: LAPIC_NMI (acpi_id[0x0f] high edge lint[0x1])
Feb 13 07:51:54.552412 kernel: ACPI: LAPIC_NMI (acpi_id[0x10] high edge lint[0x1])
Feb 13 07:51:54.552417 kernel: IOAPIC[0]: apic_id 2, version 32, address 0xfec00000, GSI 0-119
Feb 13 07:51:54.552422 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Feb 13 07:51:54.552427 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Feb 13 07:51:54.552433 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Feb 13 07:51:54.552438 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000
Feb 13 07:51:54.552443 kernel: TSC deadline timer available
Feb 13 07:51:54.552448 kernel: smpboot: Allowing 16 CPUs, 0 hotplug CPUs
Feb 13 07:51:54.552453 kernel: [mem 0x90000000-0xdfffffff] available for PCI devices
Feb 13 07:51:54.552458 kernel: Booting paravirtualized kernel on bare hardware
Feb 13 07:51:54.552464 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Feb 13 07:51:54.552469 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:512 nr_cpu_ids:16 nr_node_ids:1
Feb 13 07:51:54.552474 kernel: percpu: Embedded 55 pages/cpu s185624 r8192 d31464 u262144
Feb 13 07:51:54.552479 kernel: pcpu-alloc: s185624 r8192 d31464 u262144 alloc=1*2097152
Feb 13 07:51:54.552484 kernel: pcpu-alloc: [0] 00 01 02 03 04 05 06 07 [0] 08 09 10 11 12 13 14 15
Feb 13 07:51:54.552489 kernel: Built 1 zonelists, mobility grouping on. Total pages: 8232415
Feb 13 07:51:54.552494 kernel: Policy zone: Normal
Feb 13 07:51:54.552500 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty0 console=ttyS1,115200n8 flatcar.first_boot=detected flatcar.oem.id=packet flatcar.autologin verity.usrhash=f2beb0668e3dab90bbcf0ace3803b7ee02142bfb86913ef12ef6d2ee81a411a4
Feb 13 07:51:54.552505 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Feb 13 07:51:54.552510 kernel: Dentry cache hash table entries: 4194304 (order: 13, 33554432 bytes, linear)
Feb 13 07:51:54.552515 kernel: Inode-cache hash table entries: 2097152 (order: 12, 16777216 bytes, linear)
Feb 13 07:51:54.552520 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Feb 13 07:51:54.552526 kernel: Memory: 32724720K/33452980K available (12294K kernel code, 2275K rwdata, 13700K rodata, 45496K init, 4048K bss, 728000K reserved, 0K cma-reserved)
Feb 13 07:51:54.552531 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=16, Nodes=1
Feb 13 07:51:54.552536 kernel: ftrace: allocating 34475 entries in 135 pages
Feb 13 07:51:54.552542 kernel: ftrace: allocated 135 pages with 4 groups
Feb 13 07:51:54.552547 kernel: rcu: Hierarchical RCU implementation.
Feb 13 07:51:54.552552 kernel: rcu: RCU event tracing is enabled.
Feb 13 07:51:54.552557 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=16.
Feb 13 07:51:54.552562 kernel: Rude variant of Tasks RCU enabled.
Feb 13 07:51:54.552567 kernel: Tracing variant of Tasks RCU enabled.
Feb 13 07:51:54.552572 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Feb 13 07:51:54.552578 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=16
Feb 13 07:51:54.552583 kernel: NR_IRQS: 33024, nr_irqs: 2184, preallocated irqs: 16
Feb 13 07:51:54.552588 kernel: random: crng init done
Feb 13 07:51:54.552593 kernel: Console: colour dummy device 80x25
Feb 13 07:51:54.552598 kernel: printk: console [tty0] enabled
Feb 13 07:51:54.552603 kernel: printk: console [ttyS1] enabled
Feb 13 07:51:54.552608 kernel: ACPI: Core revision 20210730
Feb 13 07:51:54.552613 kernel: hpet: HPET dysfunctional in PC10. Force disabled.
Feb 13 07:51:54.552618 kernel: APIC: Switch to symmetric I/O mode setup
Feb 13 07:51:54.552624 kernel: DMAR: Host address width 39
Feb 13 07:51:54.552629 kernel: DMAR: DRHD base: 0x000000fed91000 flags: 0x1
Feb 13 07:51:54.552655 kernel: DMAR: dmar0: reg_base_addr fed91000 ver 1:0 cap d2008c40660462 ecap f050da
Feb 13 07:51:54.552660 kernel: DMAR: RMRR base: 0x0000008cf18000 end: 0x0000008d161fff
Feb 13 07:51:54.552665 kernel: DMAR-IR: IOAPIC id 2 under DRHD base 0xfed91000 IOMMU 0
Feb 13 07:51:54.552670 kernel: DMAR-IR: HPET id 0 under DRHD base 0xfed91000
Feb 13 07:51:54.552675 kernel: DMAR-IR: Queued invalidation will be enabled to support x2apic and Intr-remapping.
Feb 13 07:51:54.552680 kernel: DMAR-IR: Enabled IRQ remapping in x2apic mode
Feb 13 07:51:54.552701 kernel: x2apic enabled
Feb 13 07:51:54.552707 kernel: Switched APIC routing to cluster x2apic.
Feb 13 07:51:54.552712 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x3101f59f5e6, max_idle_ns: 440795259996 ns
Feb 13 07:51:54.552717 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 6799.81 BogoMIPS (lpj=3399906)
Feb 13 07:51:54.552722 kernel: CPU0: Thermal monitoring enabled (TM1)
Feb 13 07:51:54.552727 kernel: process: using mwait in idle threads
Feb 13 07:51:54.552732 kernel: Last level iTLB entries: 4KB 64, 2MB 8, 4MB 8
Feb 13 07:51:54.552737 kernel: Last level dTLB entries: 4KB 64, 2MB 0, 4MB 0, 1GB 4
Feb 13 07:51:54.552741 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Feb 13 07:51:54.552746 kernel: Spectre V2 : WARNING: Unprivileged eBPF is enabled with eIBRS on, data leaks possible via Spectre v2 BHB attacks!
Feb 13 07:51:54.552752 kernel: Spectre V2 : Mitigation: Enhanced IBRS
Feb 13 07:51:54.552757 kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch
Feb 13 07:51:54.552762 kernel: Spectre V2 : Spectre v2 / PBRSB-eIBRS: Retire a single CALL on VMEXIT
Feb 13 07:51:54.552767 kernel: RETBleed: Mitigation: Enhanced IBRS
Feb 13 07:51:54.552772 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Feb 13 07:51:54.552777 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl and seccomp
Feb 13 07:51:54.552782 kernel: TAA: Mitigation: TSX disabled
Feb 13 07:51:54.552786 kernel: MMIO Stale Data: Mitigation: Clear CPU buffers
Feb 13 07:51:54.552791 kernel: SRBDS: Mitigation: Microcode
Feb 13 07:51:54.552796 kernel: GDS: Vulnerable: No microcode
Feb 13 07:51:54.552801 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Feb 13 07:51:54.552807 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Feb 13 07:51:54.552812 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Feb 13 07:51:54.552817 kernel: x86/fpu: Supporting XSAVE feature 0x008: 'MPX bounds registers'
Feb 13 07:51:54.552822 kernel: x86/fpu: Supporting XSAVE feature 0x010: 'MPX CSR'
Feb 13 07:51:54.552827 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Feb 13 07:51:54.552831 kernel: x86/fpu: xstate_offset[3]: 832, xstate_sizes[3]: 64
Feb 13 07:51:54.552836 kernel: x86/fpu: xstate_offset[4]: 896, xstate_sizes[4]: 64
Feb 13 07:51:54.552841 kernel: x86/fpu: Enabled xstate features 0x1f, context size is 960 bytes, using 'compacted' format.
Feb 13 07:51:54.552846 kernel: Freeing SMP alternatives memory: 32K
Feb 13 07:51:54.552851 kernel: pid_max: default: 32768 minimum: 301
Feb 13 07:51:54.552856 kernel: LSM: Security Framework initializing
Feb 13 07:51:54.552861 kernel: SELinux: Initializing.
Feb 13 07:51:54.552866 kernel: Mount-cache hash table entries: 65536 (order: 7, 524288 bytes, linear)
Feb 13 07:51:54.552871 kernel: Mountpoint-cache hash table entries: 65536 (order: 7, 524288 bytes, linear)
Feb 13 07:51:54.552876 kernel: smpboot: Estimated ratio of average max frequency by base frequency (times 1024): 1445
Feb 13 07:51:54.552881 kernel: smpboot: CPU0: Intel(R) Xeon(R) E-2278G CPU @ 3.40GHz (family: 0x6, model: 0x9e, stepping: 0xd)
Feb 13 07:51:54.552886 kernel: Performance Events: PEBS fmt3+, Skylake events, 32-deep LBR, full-width counters, Intel PMU driver.
Feb 13 07:51:54.552891 kernel: ... version: 4
Feb 13 07:51:54.552896 kernel: ... bit width: 48
Feb 13 07:51:54.552901 kernel: ... generic registers: 4
Feb 13 07:51:54.552906 kernel: ... value mask: 0000ffffffffffff
Feb 13 07:51:54.552911 kernel: ... max period: 00007fffffffffff
Feb 13 07:51:54.552917 kernel: ... fixed-purpose events: 3
Feb 13 07:51:54.552922 kernel: ... event mask: 000000070000000f
Feb 13 07:51:54.552927 kernel: signal: max sigframe size: 2032
Feb 13 07:51:54.552932 kernel: rcu: Hierarchical SRCU implementation.
Feb 13 07:51:54.552936 kernel: NMI watchdog: Enabled. Permanently consumes one hw-PMU counter.
Feb 13 07:51:54.552942 kernel: smp: Bringing up secondary CPUs ...
Feb 13 07:51:54.552947 kernel: x86: Booting SMP configuration:
Feb 13 07:51:54.552951 kernel: .... node #0, CPUs: #1 #2 #3 #4 #5 #6 #7 #8
Feb 13 07:51:54.552957 kernel: MMIO Stale Data CPU bug present and SMT on, data leak possible. See https://www.kernel.org/doc/html/latest/admin-guide/hw-vuln/processor_mmio_stale_data.html for more details.
Feb 13 07:51:54.552963 kernel: #9 #10 #11 #12 #13 #14 #15
Feb 13 07:51:54.552967 kernel: smp: Brought up 1 node, 16 CPUs
Feb 13 07:51:54.552972 kernel: smpboot: Max logical packages: 1
Feb 13 07:51:54.552977 kernel: smpboot: Total of 16 processors activated (108796.99 BogoMIPS)
Feb 13 07:51:54.552982 kernel: devtmpfs: initialized
Feb 13 07:51:54.552987 kernel: x86/mm: Memory block size: 128MB
Feb 13 07:51:54.552992 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x819e3000-0x819e3fff] (4096 bytes)
Feb 13 07:51:54.552997 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x8c23b000-0x8c66cfff] (4399104 bytes)
Feb 13 07:51:54.553003 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Feb 13 07:51:54.553008 kernel: futex hash table entries: 4096 (order: 6, 262144 bytes, linear)
Feb 13 07:51:54.553013 kernel: pinctrl core: initialized pinctrl subsystem
Feb 13 07:51:54.553018 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Feb 13 07:51:54.553023 kernel: audit: initializing netlink subsys (disabled)
Feb 13 07:51:54.553028 kernel: audit: type=2000 audit(1707810709.040:1): state=initialized audit_enabled=0 res=1
Feb 13 07:51:54.553033 kernel: thermal_sys: Registered thermal governor 'step_wise'
Feb 13 07:51:54.553038 kernel: thermal_sys: Registered thermal governor 'user_space'
Feb 13 07:51:54.553043 kernel: cpuidle: using governor menu
Feb 13 07:51:54.553049 kernel: ACPI: bus type PCI registered
Feb 13 07:51:54.553054 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Feb 13 07:51:54.553059 kernel: dca service started, version 1.12.1
Feb 13 07:51:54.553064 kernel: PCI: MMCONFIG for domain 0000 [bus 00-ff] at [mem 0xe0000000-0xefffffff] (base 0xe0000000)
Feb 13 07:51:54.553069 kernel: PCI: MMCONFIG at [mem 0xe0000000-0xefffffff] reserved in E820
Feb 13 07:51:54.553074 kernel: PCI: Using configuration type 1 for base access
Feb 13 07:51:54.553078 kernel: ENERGY_PERF_BIAS: Set to 'normal', was 'performance'
Feb 13 07:51:54.553083 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Feb 13 07:51:54.553088 kernel: HugeTLB registered 1.00 GiB page size, pre-allocated 0 pages
Feb 13 07:51:54.553094 kernel: HugeTLB registered 2.00 MiB page size, pre-allocated 0 pages
Feb 13 07:51:54.553099 kernel: ACPI: Added _OSI(Module Device)
Feb 13 07:51:54.553104 kernel: ACPI: Added _OSI(Processor Device)
Feb 13 07:51:54.553109 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Feb 13 07:51:54.553114 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Feb 13 07:51:54.553119 kernel: ACPI: Added _OSI(Linux-Dell-Video)
Feb 13 07:51:54.553124 kernel: ACPI: Added _OSI(Linux-Lenovo-NV-HDMI-Audio)
Feb 13 07:51:54.553129 kernel: ACPI: Added _OSI(Linux-HPI-Hybrid-Graphics)
Feb 13 07:51:54.553134 kernel: ACPI: 12 ACPI AML tables successfully acquired and loaded
Feb 13 07:51:54.553139 kernel: ACPI: Dynamic OEM Table Load:
Feb 13 07:51:54.553144 kernel: ACPI: SSDT 0xFFFF99C3C0213A00 0000F4 (v02 PmRef Cpu0Psd 00003000 INTL 20160527)
Feb 13 07:51:54.553150 kernel: ACPI: \_SB_.PR00: _OSC native thermal LVT Acked
Feb 13 07:51:54.553154 kernel: ACPI: Dynamic OEM Table Load:
Feb 13 07:51:54.553159 kernel: ACPI: SSDT 0xFFFF99C3C1AE5C00 000400 (v02 PmRef Cpu0Cst 00003001 INTL 20160527)
Feb 13 07:51:54.553164 kernel: ACPI: Dynamic OEM Table Load:
Feb 13 07:51:54.553169 kernel: ACPI: SSDT 0xFFFF99C3C1A5E000 000683 (v02 PmRef Cpu0Ist 00003000 INTL 20160527)
Feb 13 07:51:54.553174 kernel: ACPI: Dynamic OEM Table Load:
Feb 13 07:51:54.553179 kernel: ACPI: SSDT 0xFFFF99C3C1A5D000 0005FC (v02 PmRef ApIst 00003000 INTL 20160527)
Feb 13 07:51:54.553184 kernel: ACPI: Dynamic OEM Table Load:
Feb 13 07:51:54.553190 kernel: ACPI: SSDT 0xFFFF99C3C014C000 000AB0 (v02 PmRef ApPsd 00003000 INTL 20160527)
Feb 13 07:51:54.553195 kernel: ACPI: Dynamic OEM Table Load:
Feb 13 07:51:54.553199 kernel: ACPI: SSDT 0xFFFF99C3C1AE1400 00030A (v02 PmRef ApCst 00003000 INTL 20160527)
Feb 13 07:51:54.553204 kernel: ACPI: Interpreter enabled
Feb 13 07:51:54.553209 kernel: ACPI: PM: (supports S0 S5)
Feb 13 07:51:54.553214 kernel: ACPI: Using IOAPIC for interrupt routing
Feb 13 07:51:54.553219 kernel: HEST: Enabling Firmware First mode for corrected errors.
Feb 13 07:51:54.553224 kernel: mce: [Firmware Bug]: Ignoring request to disable invalid MCA bank 14.
Feb 13 07:51:54.553229 kernel: HEST: Table parsing has been initialized.
Feb 13 07:51:54.553235 kernel: GHES: APEI firmware first mode is enabled by APEI bit and WHEA _OSC.
Feb 13 07:51:54.553240 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Feb 13 07:51:54.553245 kernel: ACPI: Enabled 9 GPEs in block 00 to 7F
Feb 13 07:51:54.553250 kernel: ACPI: PM: Power Resource [USBC]
Feb 13 07:51:54.553255 kernel: ACPI: PM: Power Resource [V0PR]
Feb 13 07:51:54.553259 kernel: ACPI: PM: Power Resource [V1PR]
Feb 13 07:51:54.553264 kernel: ACPI: PM: Power Resource [V2PR]
Feb 13 07:51:54.553269 kernel: ACPI: PM: Power Resource [WRST]
Feb 13 07:51:54.553274 kernel: ACPI: PM: Power Resource [FN00]
Feb 13 07:51:54.553280 kernel: ACPI: PM: Power Resource [FN01]
Feb 13 07:51:54.553285 kernel: ACPI: PM: Power Resource [FN02]
Feb 13 07:51:54.553290 kernel: ACPI: PM: Power Resource [FN03]
Feb 13 07:51:54.553294 kernel: ACPI: PM: Power Resource [FN04]
Feb 13 07:51:54.553299 kernel: ACPI: PM: Power Resource [PIN]
Feb 13 07:51:54.553304 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-fe])
Feb 13 07:51:54.553371 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Feb 13 07:51:54.553416 kernel: acpi PNP0A08:00: _OSC: platform does not support [AER]
Feb 13 07:51:54.553458 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME PCIeCapability LTR]
Feb 13 07:51:54.553466 kernel: PCI host bridge to bus 0000:00
Feb 13 07:51:54.553509 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Feb 13 07:51:54.553546 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Feb 13 07:51:54.553582 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Feb 13 07:51:54.553617 kernel: pci_bus 0000:00: root bus resource [mem 0x90000000-0xdfffffff window]
Feb 13 07:51:54.553674 kernel: pci_bus 0000:00: root bus resource [mem 0xfc800000-0xfe7fffff window]
Feb 13 07:51:54.553713 kernel: pci_bus 0000:00: root bus resource [bus 00-fe]
Feb 13 07:51:54.553762 kernel: pci 0000:00:00.0: [8086:3e31] type 00 class 0x060000
Feb 13 07:51:54.553811 kernel: pci 0000:00:01.0: [8086:1901] type 01 class 0x060400
Feb 13 07:51:54.553855 kernel: pci 0000:00:01.0: PME# supported from D0 D3hot D3cold
Feb 13 07:51:54.553903 kernel: pci 0000:00:08.0: [8086:1911] type 00 class 0x088000
Feb 13 07:51:54.553945 kernel: pci 0000:00:08.0: reg 0x10: [mem 0x9551f000-0x9551ffff 64bit]
Feb 13 07:51:54.553991 kernel: pci 0000:00:12.0: [8086:a379] type 00 class 0x118000
Feb 13 07:51:54.554035 kernel: pci 0000:00:12.0: reg 0x10: [mem 0x9551e000-0x9551efff 64bit]
Feb 13 07:51:54.554081 kernel: pci 0000:00:14.0: [8086:a36d] type 00 class 0x0c0330
Feb 13 07:51:54.554124 kernel: pci 0000:00:14.0: reg 0x10: [mem 0x95500000-0x9550ffff 64bit]
Feb 13 07:51:54.554167 kernel: pci 0000:00:14.0: PME# supported from D3hot D3cold
Feb 13 07:51:54.554213 kernel: pci 0000:00:14.2: [8086:a36f] type 00 class 0x050000
Feb 13 07:51:54.554257 kernel: pci 0000:00:14.2: reg 0x10: [mem 0x95512000-0x95513fff 64bit]
Feb 13 07:51:54.554298 kernel: pci 0000:00:14.2: reg 0x18: [mem 0x9551d000-0x9551dfff 64bit]
Feb 13 07:51:54.554345 kernel: pci 0000:00:15.0: [8086:a368] type 00 class 0x0c8000
Feb 13 07:51:54.554386 kernel: pci 0000:00:15.0: reg 0x10: [mem 0x00000000-0x00000fff 64bit]
Feb 13 07:51:54.554431 kernel: pci 0000:00:15.1: [8086:a369] type 00 class 0x0c8000
Feb 13 07:51:54.554473 kernel: pci 0000:00:15.1: reg 0x10: [mem 0x00000000-0x00000fff 64bit]
Feb 13 07:51:54.554518 kernel: pci 0000:00:16.0: [8086:a360] type 00 class 0x078000
Feb 13 07:51:54.554561 kernel: pci 0000:00:16.0: reg 0x10: [mem 0x9551a000-0x9551afff 64bit]
Feb 13 07:51:54.554602 kernel: pci 0000:00:16.0: PME# supported from D3hot
Feb 13 07:51:54.554650 kernel: pci 0000:00:16.1: [8086:a361] type 00 class 0x078000
Feb 13 07:51:54.554692 kernel: pci 0000:00:16.1: reg 0x10: [mem 0x95519000-0x95519fff 64bit]
Feb 13 07:51:54.554735 kernel: pci 0000:00:16.1: PME# supported from D3hot
Feb 13 07:51:54.554780 kernel: pci 0000:00:16.4: [8086:a364] type 00 class 0x078000
Feb 13 07:51:54.554824 kernel: pci 0000:00:16.4: reg 0x10: [mem 0x95518000-0x95518fff 64bit]
Feb 13 07:51:54.554865 kernel: pci 0000:00:16.4: PME# supported from D3hot
Feb 13 07:51:54.554909 kernel: pci 0000:00:17.0: [8086:a352] type 00 class 0x010601
Feb 13 07:51:54.554951 kernel: pci 0000:00:17.0: reg 0x10: [mem 0x95510000-0x95511fff]
Feb 13 07:51:54.554991 kernel: pci 0000:00:17.0: reg 0x14: [mem 0x95517000-0x955170ff]
Feb 13 07:51:54.555033 kernel: pci 0000:00:17.0: reg 0x18: [io 0x6050-0x6057]
Feb 13 07:51:54.555073 kernel: pci 0000:00:17.0: reg 0x1c: [io 0x6040-0x6043]
Feb 13 07:51:54.555122 kernel: pci 0000:00:17.0: reg 0x20: [io 0x6020-0x603f]
Feb 13 07:51:54.555165 kernel: pci 0000:00:17.0: reg 0x24: [mem 0x95516000-0x955167ff]
Feb 13 07:51:54.555206 kernel: pci 0000:00:17.0: PME# supported from D3hot
Feb 13 07:51:54.555251 kernel: pci 0000:00:1b.0: [8086:a340] type 01 class 0x060400
Feb 13 07:51:54.555295 kernel: pci 0000:00:1b.0: PME# supported from D0 D3hot D3cold
Feb 13 07:51:54.555340 kernel: pci 0000:00:1b.4: [8086:a32c] type 01 class 0x060400
Feb 13 07:51:54.555383 kernel: pci 0000:00:1b.4: PME# supported from D0 D3hot D3cold
Feb 13 07:51:54.555431 kernel: pci 0000:00:1b.5: [8086:a32d] type 01 class 0x060400
Feb 13 07:51:54.555473 kernel: pci 0000:00:1b.5: PME# supported from D0 D3hot D3cold
Feb 13 07:51:54.555522 kernel: pci 0000:00:1c.0: [8086:a338] type 01 class 0x060400
Feb 13 07:51:54.555564 kernel: pci 0000:00:1c.0: PME# supported from D0 D3hot D3cold
Feb 13 07:51:54.555610 kernel: pci 0000:00:1c.3: [8086:a33b] type 01 class 0x060400
Feb 13 07:51:54.555657 kernel: pci 0000:00:1c.3: PME# supported from D0 D3hot D3cold
Feb 13 07:51:54.555702 kernel: pci 0000:00:1e.0: [8086:a328] type 00 class 0x078000
Feb 13 07:51:54.555745 kernel: pci 0000:00:1e.0: reg 0x10: [mem 0x00000000-0x00000fff 64bit]
Feb 13 07:51:54.555793 kernel: pci 0000:00:1f.0: [8086:a309] type 00 class 0x060100
Feb 13 07:51:54.555841 kernel: pci 0000:00:1f.4: [8086:a323] type 00 class 0x0c0500
Feb 13 07:51:54.555884 kernel: pci 0000:00:1f.4: reg 0x10: [mem 0x95514000-0x955140ff 64bit]
Feb 13 07:51:54.555926 kernel: pci 0000:00:1f.4: reg 0x20: [io 0xefa0-0xefbf]
Feb 13 07:51:54.555971 kernel: pci 0000:00:1f.5: [8086:a324] type 00 class 0x0c8000
Feb 13 07:51:54.556014 kernel: pci 0000:00:1f.5: reg 0x10: [mem 0xfe010000-0xfe010fff]
Feb 13 07:51:54.556063 kernel: pci 0000:01:00.0: [15b3:1015] type 00 class 0x020000
Feb 13 07:51:54.556110 kernel: pci 0000:01:00.0: reg 0x10: [mem 0x92000000-0x93ffffff 64bit pref]
Feb 13 07:51:54.556154 kernel: pci 0000:01:00.0: reg 0x30: [mem 0x95200000-0x952fffff pref]
Feb 13 07:51:54.556197 kernel: pci 0000:01:00.0: PME# supported from D3cold
Feb 13 07:51:54.556240 kernel: pci 0000:01:00.0: reg 0x1a4: [mem 0x00000000-0x000fffff 64bit pref]
Feb 13 07:51:54.556283 kernel: pci 0000:01:00.0: VF(n) BAR0 space: [mem 0x00000000-0x007fffff 64bit pref] (contains BAR0 for 8 VFs)
Feb 13 07:51:54.556331 kernel: pci 0000:01:00.1: [15b3:1015] type 00 class 0x020000
Feb 13 07:51:54.556376 kernel: pci 0000:01:00.1: reg 0x10: [mem 0x90000000-0x91ffffff 64bit pref]
Feb 13 07:51:54.556422 kernel: pci 0000:01:00.1: reg 0x30: [mem 0x95100000-0x951fffff pref]
Feb 13 07:51:54.556465 kernel: pci 0000:01:00.1: PME# supported from D3cold
Feb 13 07:51:54.556509 kernel: pci 0000:01:00.1: reg 0x1a4: [mem 0x00000000-0x000fffff 64bit pref]
Feb 13 07:51:54.556552 kernel: pci 0000:01:00.1: VF(n) BAR0 space: [mem 0x00000000-0x007fffff 64bit pref] (contains BAR0 for 8 VFs)
Feb 13 07:51:54.556595 kernel: pci 0000:00:01.0: PCI bridge to [bus 01]
Feb 13 07:51:54.556640 kernel: pci 0000:00:01.0: bridge window [mem 0x95100000-0x952fffff]
Feb 13 07:51:54.556682 kernel: pci 0000:00:01.0: bridge window [mem 0x90000000-0x93ffffff 64bit pref]
Feb 13 07:51:54.556725 kernel: pci 0000:00:1b.0: PCI bridge to [bus 02]
Feb 13 07:51:54.556775 kernel: pci 0000:03:00.0: [8086:1533] type 00 class 0x020000
Feb 13 07:51:54.556820 kernel: pci 0000:03:00.0: reg 0x10: [mem 0x95400000-0x9547ffff]
Feb 13 07:51:54.556863 kernel: pci 0000:03:00.0: reg 0x18: [io 0x5000-0x501f]
Feb 13 07:51:54.556926 kernel: pci 0000:03:00.0: reg 0x1c: [mem 0x95480000-0x95483fff]
Feb 13 07:51:54.556968 kernel: pci 0000:03:00.0: PME# supported from D0 D3hot D3cold
Feb 13 07:51:54.557010 kernel: pci 0000:00:1b.4: PCI bridge to [bus 03]
Feb 13 07:51:54.557052 kernel: pci 0000:00:1b.4: bridge window [io 0x5000-0x5fff]
Feb 13 07:51:54.557095 kernel: pci 0000:00:1b.4: bridge window [mem 0x95400000-0x954fffff]
Feb 13 07:51:54.557144 kernel: pci 0000:04:00.0: [8086:1533] type 00 class 0x020000
Feb 13 07:51:54.557187 kernel: pci 0000:04:00.0: reg 0x10: [mem 0x95300000-0x9537ffff]
Feb 13 07:51:54.557232 kernel: pci 0000:04:00.0: reg 0x18: [io 0x4000-0x401f]
Feb 13 07:51:54.557274 kernel: pci 0000:04:00.0: reg 0x1c: [mem 0x95380000-0x95383fff]
Feb 13 07:51:54.557354 kernel: pci 0000:04:00.0: PME# supported from D0 D3hot D3cold
Feb 13 07:51:54.557418 kernel: pci 0000:00:1b.5: PCI bridge to [bus 04]
Feb 13 07:51:54.557460 kernel: pci 0000:00:1b.5: bridge window [io 0x4000-0x4fff]
Feb 13 07:51:54.557503 kernel: pci 0000:00:1b.5: bridge window [mem 0x95300000-0x953fffff]
Feb 13 07:51:54.557546 kernel: pci 0000:00:1c.0: PCI bridge to [bus 05]
Feb 13 07:51:54.557593 kernel: pci 0000:06:00.0: [1a03:1150] type 01 class 0x060400
Feb 13 07:51:54.557662 kernel: pci 0000:06:00.0: enabling Extended Tags
Feb 13 07:51:54.557726 kernel: pci 0000:06:00.0: supports D1 D2
Feb 13 07:51:54.557770 kernel: pci 0000:06:00.0: PME# supported from D0 D1 D2 D3hot D3cold
Feb 13 07:51:54.557812 kernel: pci 0000:00:1c.3: PCI bridge to [bus 06-07]
Feb 13 07:51:54.557854 kernel: pci 0000:00:1c.3: bridge window [io 0x3000-0x3fff]
Feb 13 07:51:54.557899 kernel: pci 0000:00:1c.3: bridge window [mem 0x94000000-0x950fffff]
Feb 13 07:51:54.557944 kernel: pci_bus 0000:07: extended config space not accessible
Feb 13 07:51:54.557995 kernel: pci 0000:07:00.0: [1a03:2000] type 00 class 0x030000
Feb 13 07:51:54.558040 kernel: pci 0000:07:00.0: reg 0x10: [mem 0x94000000-0x94ffffff]
Feb 13 07:51:54.558086 kernel: pci 0000:07:00.0: reg 0x14: [mem 0x95000000-0x9501ffff]
Feb 13 07:51:54.558131 kernel: pci 0000:07:00.0: reg 0x18: [io 0x3000-0x307f]
Feb 13 07:51:54.558176 kernel: pci 0000:07:00.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Feb 13 07:51:54.558223 kernel: pci 0000:07:00.0: supports D1 D2
Feb 13 07:51:54.558269 kernel: pci 0000:07:00.0: PME# supported from D0 D1 D2 D3hot D3cold
Feb 13 07:51:54.558312 kernel: pci 0000:06:00.0: PCI bridge to [bus 07]
Feb 13 07:51:54.558355 kernel: pci 0000:06:00.0: bridge window [io 0x3000-0x3fff]
Feb 13 07:51:54.558398 kernel: pci 0000:06:00.0: bridge window [mem 0x94000000-0x950fffff]
Feb 13 07:51:54.558406 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 0
Feb 13 07:51:54.558412 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 1
Feb 13 07:51:54.558419 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 0
Feb 13 07:51:54.558424 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 0
Feb 13 07:51:54.558429 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 0
Feb 13 07:51:54.558435 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 0
Feb 13 07:51:54.558440 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 0
Feb 13 07:51:54.558445 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 0
Feb 13 07:51:54.558450 kernel: iommu: Default domain type: Translated
Feb 13 07:51:54.558456 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Feb 13 07:51:54.558500 kernel: pci 0000:07:00.0: vgaarb: setting as boot VGA device
Feb 13 07:51:54.558546 kernel: pci 0000:07:00.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Feb 13 07:51:54.558591 kernel: pci 0000:07:00.0: vgaarb: bridge control possible
Feb 13 07:51:54.558599 kernel: vgaarb: loaded
Feb 13 07:51:54.558604 kernel: pps_core: LinuxPPS API ver. 1 registered
Feb 13 07:51:54.558610 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti
Feb 13 07:51:54.558615 kernel: PTP clock support registered
Feb 13 07:51:54.558620 kernel: PCI: Using ACPI for IRQ routing
Feb 13 07:51:54.558626 kernel: PCI: pci_cache_line_size set to 64 bytes
Feb 13 07:51:54.558633 kernel: e820: reserve RAM buffer [mem 0x00099800-0x0009ffff]
Feb 13 07:51:54.558662 kernel: e820: reserve RAM buffer [mem 0x819e3000-0x83ffffff]
Feb 13 07:51:54.558667 kernel: e820: reserve RAM buffer [mem 0x8afcd000-0x8bffffff]
Feb 13 07:51:54.558672 kernel: e820: reserve RAM buffer [mem 0x8c23b000-0x8fffffff]
Feb 13 07:51:54.558677 kernel: e820: reserve RAM buffer [mem 0x8ef00000-0x8fffffff]
Feb 13 07:51:54.558699 kernel: e820: reserve RAM buffer [mem 0x86f000000-0x86fffffff]
Feb 13 07:51:54.558704 kernel: clocksource: Switched to clocksource tsc-early
Feb 13 07:51:54.558709 kernel: VFS: Disk quotas dquot_6.6.0
Feb 13 07:51:54.558715 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Feb 13 07:51:54.558720 kernel: pnp: PnP ACPI init
Feb 13 07:51:54.558763 kernel: system 00:00: [mem 0x40000000-0x403fffff] has been reserved
Feb 13 07:51:54.558805 kernel: pnp 00:02: [dma 0 disabled]
Feb 13 07:51:54.558846 kernel: pnp 00:03: [dma 0 disabled]
Feb 13 07:51:54.558887 kernel: system 00:04: [io 0x0680-0x069f] has been reserved
Feb 13 07:51:54.558925 kernel: system 00:04: [io 0x164e-0x164f] has been reserved
Feb 13 07:51:54.558966 kernel: system 00:05: [io 0x1854-0x1857] has been reserved
Feb 13 07:51:54.559012 kernel: system 00:06: [mem 0xfed10000-0xfed17fff] has been reserved
Feb 13 07:51:54.559049 kernel: system 00:06: [mem 0xfed18000-0xfed18fff] has been reserved
Feb 13 07:51:54.559087 kernel: system 00:06: [mem 0xfed19000-0xfed19fff] has been reserved
Feb 13 07:51:54.559125 kernel: system 00:06: [mem 0xe0000000-0xefffffff] has been reserved
Feb 13 07:51:54.559162 kernel: system 00:06: [mem 0xfed20000-0xfed3ffff] has been reserved
Feb 13 07:51:54.559201 kernel: system 00:06: [mem 0xfed90000-0xfed93fff] could not be reserved
Feb 13 07:51:54.559238 kernel: system 00:06: [mem 0xfed45000-0xfed8ffff] has been reserved
Feb 13 07:51:54.559277 kernel: system 00:06: [mem 0xfee00000-0xfeefffff] could not be reserved
Feb 13 07:51:54.559317 kernel: system 00:07: [io 0x1800-0x18fe] could not be reserved
Feb 13 07:51:54.559355 kernel: system 00:07: [mem 0xfd000000-0xfd69ffff] has been reserved
Feb 13 07:51:54.559392 kernel: system 00:07: [mem 0xfd6c0000-0xfd6cffff] has been reserved
Feb 13 07:51:54.559430 kernel: system 00:07: [mem 0xfd6f0000-0xfdffffff] has been reserved
Feb 13 07:51:54.559466 kernel: system 00:07: [mem 0xfe000000-0xfe01ffff] could not be reserved
Feb 13 07:51:54.559503 kernel: system 00:07: [mem 0xfe200000-0xfe7fffff] has been reserved
Feb 13 07:51:54.559542 kernel: system 00:07: [mem 0xff000000-0xffffffff] has been reserved
Feb 13 07:51:54.559582 kernel: system 00:08: [io 0x2000-0x20fe] has been reserved
Feb 13 07:51:54.559595 kernel: pnp: PnP ACPI: found 10 devices
Feb 13 07:51:54.559601 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Feb 13 07:51:54.559606 kernel: NET: Registered PF_INET protocol family
Feb 13 07:51:54.559612 kernel: IP idents hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Feb 13 07:51:54.559617 kernel: tcp_listen_portaddr_hash hash table entries: 16384 (order: 6, 262144 bytes, linear)
Feb 13 07:51:54.559624 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Feb 13 07:51:54.559629 kernel: TCP established hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Feb 13 07:51:54.559659 kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Feb 13 07:51:54.559664 kernel: TCP: Hash tables configured (established 262144 bind 65536)
Feb 13 07:51:54.559670 kernel: UDP hash table entries: 16384 (order: 7, 524288 bytes, linear)
Feb 13 07:51:54.559675 kernel: UDP-Lite hash table entries: 16384 (order: 7, 524288 bytes, linear)
Feb 13 07:51:54.559680 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Feb 13 07:51:54.559743 kernel: NET: Registered PF_XDP protocol family
Feb 13 07:51:54.559787 kernel: pci 0000:00:15.0: BAR 0: assigned [mem 0x95515000-0x95515fff 64bit]
Feb 13 07:51:54.559830 kernel: pci 0000:00:15.1: BAR 0: assigned [mem 0x9551b000-0x9551bfff 64bit]
Feb 13 07:51:54.559873 kernel: pci 0000:00:1e.0: BAR 0: assigned [mem 0x9551c000-0x9551cfff 64bit]
Feb 13 07:51:54.559917 kernel: pci 0000:01:00.0: BAR 7: no space for [mem size 0x00800000 64bit pref]
Feb 13 07:51:54.559960 kernel: pci 0000:01:00.0: BAR 7: failed to assign [mem size 0x00800000 64bit pref]
Feb 13 07:51:54.560004 kernel: pci 0000:01:00.1: BAR 7: no space for [mem size 0x00800000 64bit pref]
Feb 13 07:51:54.560046 kernel: pci 0000:01:00.1: BAR 7: failed to assign [mem size 0x00800000 64bit pref]
Feb 13 07:51:54.560088 kernel: pci 0000:00:01.0: PCI bridge to [bus 01]
Feb 13 07:51:54.560132 kernel: pci 0000:00:01.0: bridge window [mem 0x95100000-0x952fffff]
Feb 13 07:51:54.560173 kernel: pci 0000:00:01.0: bridge window [mem 0x90000000-0x93ffffff 64bit pref]
Feb 13 07:51:54.560214 kernel: pci 0000:00:1b.0: PCI bridge to [bus 02]
Feb 13 07:51:54.560256 kernel: pci 0000:00:1b.4: PCI bridge to [bus 03]
Feb 13 07:51:54.560299 kernel: pci 0000:00:1b.4: bridge window [io 0x5000-0x5fff]
Feb 13 07:51:54.560342 kernel: pci 0000:00:1b.4: bridge window [mem 0x95400000-0x954fffff]
Feb 13 07:51:54.560383 kernel: pci 0000:00:1b.5: PCI bridge to [bus 04]
Feb 13 07:51:54.560426 kernel: pci 0000:00:1b.5: bridge window [io 0x4000-0x4fff]
Feb 13 07:51:54.560468 kernel: pci 0000:00:1b.5: bridge window [mem 0x95300000-0x953fffff]
Feb 13 07:51:54.560511 kernel: pci 0000:00:1c.0: PCI bridge to [bus 05]
Feb 13 07:51:54.560554 kernel: pci 0000:06:00.0: PCI bridge to [bus 07]
Feb 13 07:51:54.560597 kernel: pci 0000:06:00.0: bridge window [io 0x3000-0x3fff]
Feb 13 07:51:54.560662 kernel: pci 0000:06:00.0: bridge window [mem 0x94000000-0x950fffff]
Feb 13 07:51:54.560706 kernel: pci 0000:00:1c.3: PCI bridge to [bus 06-07]
Feb 13 07:51:54.560750 kernel: pci 0000:00:1c.3: bridge window [io 0x3000-0x3fff]
Feb 13 07:51:54.560788 kernel: pci 0000:00:1c.3: bridge window [mem 0x94000000-0x950fffff]
Feb 13 07:51:54.560827 kernel: pci_bus 0000:00: Some PCI device resources are unassigned, try booting with pci=realloc
Feb 13 07:51:54.560863 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
Feb 13 07:51:54.560901 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
Feb 13 07:51:54.560937 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Feb 13 07:51:54.560973 kernel: pci_bus 0000:00: resource 7 [mem 0x90000000-0xdfffffff window]
Feb 13 07:51:54.561018 kernel: pci_bus 0000:00: resource 8 [mem 0xfc800000-0xfe7fffff window]
Feb 13 07:51:54.561059 kernel: pci_bus 0000:01: resource 1 [mem 0x95100000-0x952fffff]
Feb 13 07:51:54.561103 kernel: pci_bus 0000:01: resource 2 [mem 0x90000000-0x93ffffff 64bit pref]
Feb 13 07:51:54.561144 kernel: pci_bus 0000:03: resource 0 [io 0x5000-0x5fff]
Feb 13 07:51:54.561190 kernel: pci_bus 0000:03: resource 1 [mem 0x95400000-0x954fffff]
Feb 13 07:51:54.561232 kernel: pci_bus 0000:04: resource 0 [io 0x4000-0x4fff]
Feb 13 07:51:54.561275 kernel: pci_bus 0000:04: resource 1 [mem 0x95300000-0x953fffff]
Feb 13 07:51:54.561316 kernel: pci_bus 0000:06: resource 0 [io 0x3000-0x3fff]
Feb 13 07:51:54.561356 kernel: pci_bus 0000:06: resource 1 [mem 0x94000000-0x950fffff]
Feb 13 07:51:54.561398 kernel: pci_bus 0000:07: resource 0 [io 0x3000-0x3fff]
Feb 13 07:51:54.561406 kernel: pci_bus 0000:07: resource 1 [mem 0x94000000-0x950fffff]
Feb 13 07:51:54.561412 kernel: PCI: CLS 64 bytes, default 64
Feb 13 07:51:54.561417 kernel: DMAR: No ATSR found
Feb 13 07:51:54.561423 kernel: DMAR: No SATC found
Feb 13 07:51:54.561464 kernel: DMAR: dmar0: Using Queued invalidation
Feb 13 07:51:54.561510 kernel: pci 0000:00:00.0: Adding to iommu group 0
Feb 13 07:51:54.561553 kernel: pci 0000:00:01.0: Adding to iommu group 1
Feb 13 07:51:54.561596 kernel: pci 0000:00:08.0: Adding to iommu group 2
Feb 13 07:51:54.561641 kernel: pci 0000:00:12.0: Adding to iommu group 3
Feb 13 07:51:54.561684 kernel: pci 0000:00:14.0: Adding to iommu group 4
Feb 13 07:51:54.561726 kernel: pci 0000:00:14.2: Adding to iommu group 4
Feb 13 07:51:54.561767 kernel: pci 0000:00:15.0: Adding to iommu group 5
Feb 13 07:51:54.561810 kernel: pci 0000:00:15.1: Adding to iommu group 5
Feb 13 07:51:54.561854 kernel: pci 0000:00:16.0: Adding to iommu group 6
Feb 13 07:51:54.561915 kernel: pci 0000:00:16.1: Adding to iommu group 6
Feb 13 07:51:54.561956 kernel: pci 0000:00:16.4: Adding to iommu group 6
Feb 13 07:51:54.561998 kernel: pci 0000:00:17.0: Adding to iommu group 7
Feb 13 07:51:54.562039 kernel: pci 0000:00:1b.0: Adding to iommu group 8
Feb 13 07:51:54.562080 kernel: pci 0000:00:1b.4: Adding to iommu group 9
Feb 13 07:51:54.562124 kernel: pci 0000:00:1b.5: Adding to iommu group 10
Feb 13 07:51:54.562165 kernel: pci 0000:00:1c.0: Adding to iommu group 11
Feb 13 07:51:54.562209 kernel: pci 0000:00:1c.3: Adding to iommu group 12
Feb 13 07:51:54.562250 kernel: pci 0000:00:1e.0: Adding to iommu group 13
Feb 13 07:51:54.562292 kernel: pci 0000:00:1f.0: Adding to iommu group 14
Feb 13 07:51:54.562332 kernel: pci 0000:00:1f.4: Adding to iommu group 14
Feb 13 07:51:54.562376 kernel: pci 0000:00:1f.5: Adding to iommu group 14
Feb 13 07:51:54.562419 kernel: pci 0000:01:00.0: Adding to iommu group 1
Feb 13 07:51:54.562461 kernel: pci 0000:01:00.1: Adding to iommu group 1
Feb 13 07:51:54.562506 kernel: pci 0000:03:00.0: Adding to iommu group 15
Feb 13 07:51:54.562552 kernel: pci 0000:04:00.0: Adding to iommu group 16
Feb 13 07:51:54.562598 kernel: pci 0000:06:00.0: Adding to iommu group 17
Feb 13 07:51:54.562606 kernel: pci 0000:07:00.0: Adding to iommu group 17
Feb 13 07:51:54.562612 kernel: DMAR: Intel(R) Virtualization Technology for Directed I/O
Feb 13 07:51:54.562617 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Feb 13 07:51:54.562623 kernel: software IO TLB: mapped [mem 0x0000000086fcd000-0x000000008afcd000] (64MB)
Feb 13 07:51:54.562628 kernel: RAPL PMU: API unit is 2^-32 Joules, 3 fixed counters, 655360 ms ovfl timer
Feb 13 07:51:54.562657 kernel: RAPL PMU: hw unit of domain pp0-core 2^-14 Joules
Feb 13 07:51:54.562664 kernel: RAPL PMU: hw unit of domain package 2^-14 Joules
Feb 13 07:51:54.562723 kernel: RAPL PMU: hw unit of domain dram 2^-14 Joules
Feb 13 07:51:54.562732 kernel: platform rtc_cmos: registered platform RTC device (no PNP device found)
Feb 13 07:51:54.562737 kernel: Initialise system trusted keyrings
Feb 13 07:51:54.562742 kernel: workingset: timestamp_bits=39 max_order=23 bucket_order=0
Feb 13 07:51:54.562747 kernel: Key type asymmetric registered
Feb 13 07:51:54.562753 kernel: Asymmetric key parser 'x509' registered
Feb 13 07:51:54.562758 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249)
Feb 13 07:51:54.562765 kernel: io scheduler mq-deadline registered
Feb 13 07:51:54.562770 kernel: io scheduler kyber registered
Feb 13 07:51:54.562813 kernel: io scheduler bfq registered
Feb 13 07:51:54.562855 kernel: pcieport 0000:00:01.0: PME: Signaling with IRQ 121
Feb 13 07:51:54.562898 kernel: pcieport 0000:00:1b.0: PME: Signaling with IRQ 122
Feb 13 07:51:54.562940 kernel: pcieport 0000:00:1b.4: PME: Signaling with IRQ 123
Feb 13 07:51:54.562982 kernel: pcieport 0000:00:1b.5: PME: Signaling with IRQ 124
Feb 13 07:51:54.563023 kernel: pcieport 0000:00:1c.0: PME: Signaling with IRQ 125
Feb 13 07:51:54.563071 kernel: pcieport 0000:00:1c.3: PME: Signaling with IRQ 126
Feb 13 07:51:54.563079 kernel: thermal LNXTHERM:00: registered as thermal_zone0
Feb 13 07:51:54.563084 kernel: ACPI: thermal: Thermal Zone [TZ00] (28 C)
Feb 13 07:51:54.563090 kernel: ERST: Error Record Serialization Table (ERST) support is initialized.
Feb 13 07:51:54.563095 kernel: pstore: Registered erst as persistent store backend
Feb 13 07:51:54.563101 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00
Feb 13 07:51:54.563106 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Feb 13 07:51:54.563111 kernel: 00:02: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Feb 13 07:51:54.563118 kernel: 00:03: ttyS1 at I/O 0x2f8 (irq = 3, base_baud = 115200) is a 16550A
Feb 13 07:51:54.563161 kernel: hpet_acpi_add: no address or irqs in _CRS
Feb 13 07:51:54.563169 kernel: tpm_tis MSFT0101:00: 2.0 TPM (device-id 0x1B, rev-id 16)
Feb 13 07:51:54.563169 kernel: i8042: PNP: No PS/2 controller found.
Feb 13 07:51:54.563206 kernel: rtc_cmos rtc_cmos: RTC can wake from S4
Feb 13 07:51:54.563245 kernel: rtc_cmos rtc_cmos: registered as rtc0
Feb 13 07:51:54.563283 kernel: rtc_cmos rtc_cmos: setting system clock to 2024-02-13T07:51:53 UTC (1707810713)
Feb 13 07:51:54.563321 kernel: rtc_cmos rtc_cmos: alarms up to one month, y3k, 114 bytes nvram
Feb 13 07:51:54.563329 kernel: fail to initialize ptp_kvm
Feb 13 07:51:54.563335 kernel: intel_pstate: Intel P-state driver initializing
Feb 13 07:51:54.563341 kernel: intel_pstate: Disabling energy efficiency optimization
Feb 13 07:51:54.563346 kernel: intel_pstate: HWP enabled
Feb 13 07:51:54.563352 kernel: vesafb: mode is 1024x768x8, linelength=1024, pages=0
Feb 13 07:51:54.563357 kernel: vesafb: scrolling: redraw
Feb 13 07:51:54.563362 kernel: vesafb: Pseudocolor: size=0:8:8:8, shift=0:0:0:0
Feb 13 07:51:54.563368 kernel: vesafb: framebuffer at 0x94000000, mapped to 0x00000000d3ec108e, using 768k, total 768k
Feb 13 07:51:54.563373 kernel: Console: switching to colour frame buffer device 128x48
Feb 13 07:51:54.563378 kernel: fb0: VESA VGA frame buffer device
Feb 13 07:51:54.563385 kernel: NET: Registered PF_INET6 protocol family
Feb 13 07:51:54.563390 kernel: Segment Routing with IPv6
Feb 13 07:51:54.563395 kernel: In-situ OAM (IOAM) with IPv6
Feb 13 07:51:54.563400 kernel: NET: Registered PF_PACKET protocol family
Feb 13 07:51:54.563406 kernel: Key type dns_resolver registered
Feb 13 07:51:54.563411 kernel: microcode: sig=0x906ed, pf=0x2, revision=0xf4
Feb 13 07:51:54.563417 kernel: microcode: Microcode Update Driver: v2.2.
Feb 13 07:51:54.563422 kernel: IPI shorthand broadcast: enabled
Feb 13 07:51:54.563427 kernel: sched_clock: Marking stable (1731512088, 1339386450)->(4491168381, -1420269843)
Feb 13 07:51:54.563433 kernel: registered taskstats version 1
Feb 13 07:51:54.563438 kernel: Loading compiled-in X.509 certificates
Feb 13 07:51:54.563444 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 5.15.148-flatcar: 253e5c5c936b12e2ff2626e7f3214deb753330c8'
Feb 13 07:51:54.563449 kernel: Key type .fscrypt registered
Feb 13 07:51:54.563454 kernel: Key type fscrypt-provisioning registered
Feb 13 07:51:54.563460 kernel: pstore: Using crash dump compression: deflate
Feb 13 07:51:54.563465 kernel: ima: Allocated hash algorithm: sha1
Feb 13 07:51:54.563470 kernel: ima: No architecture policies found
Feb 13 07:51:54.563476 kernel: Freeing unused kernel image (initmem) memory: 45496K
Feb 13 07:51:54.563482 kernel: Write protecting the kernel read-only data: 28672k
Feb 13 07:51:54.563487 kernel: Freeing unused kernel image (text/rodata gap) memory: 2040K
Feb 13 07:51:54.563492 kernel: Freeing unused kernel image (rodata/data gap) memory: 636K
Feb 13 07:51:54.563498 kernel: Run /init as init process
Feb 13 07:51:54.563503 kernel: with arguments:
Feb 13 07:51:54.563508 kernel: /init
Feb 13 07:51:54.563514 kernel: with environment:
Feb 13 07:51:54.563519 kernel: HOME=/
Feb 13 07:51:54.563524 kernel: TERM=linux
Feb 13 07:51:54.563530 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
Feb 13 07:51:54.563536 systemd[1]: systemd 252 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL -ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE -TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Feb 13 07:51:54.563543 systemd[1]: Detected architecture x86-64.
Feb 13 07:51:54.563549 systemd[1]: Running in initrd. Feb 13 07:51:54.563554 systemd[1]: No hostname configured, using default hostname. Feb 13 07:51:54.563559 systemd[1]: Hostname set to . Feb 13 07:51:54.563565 systemd[1]: Initializing machine ID from random generator. Feb 13 07:51:54.563571 systemd[1]: Queued start job for default target initrd.target. Feb 13 07:51:54.563577 systemd[1]: Started systemd-ask-password-console.path. Feb 13 07:51:54.563582 systemd[1]: Reached target cryptsetup.target. Feb 13 07:51:54.563588 systemd[1]: Reached target paths.target. Feb 13 07:51:54.563593 systemd[1]: Reached target slices.target. Feb 13 07:51:54.563598 systemd[1]: Reached target swap.target. Feb 13 07:51:54.563604 systemd[1]: Reached target timers.target. Feb 13 07:51:54.563609 systemd[1]: Listening on iscsid.socket. Feb 13 07:51:54.563616 systemd[1]: Listening on iscsiuio.socket. Feb 13 07:51:54.563622 systemd[1]: Listening on systemd-journald-audit.socket. Feb 13 07:51:54.563627 systemd[1]: Listening on systemd-journald-dev-log.socket. Feb 13 07:51:54.563655 systemd[1]: Listening on systemd-journald.socket. Feb 13 07:51:54.563661 kernel: tsc: Refined TSC clocksource calibration: 3407.999 MHz Feb 13 07:51:54.563667 systemd[1]: Listening on systemd-networkd.socket. Feb 13 07:51:54.563672 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x311fd336761, max_idle_ns: 440795243819 ns Feb 13 07:51:54.563678 kernel: clocksource: Switched to clocksource tsc Feb 13 07:51:54.563703 systemd[1]: Listening on systemd-udevd-control.socket. Feb 13 07:51:54.563709 systemd[1]: Listening on systemd-udevd-kernel.socket. Feb 13 07:51:54.563714 systemd[1]: Reached target sockets.target. Feb 13 07:51:54.563720 systemd[1]: Starting kmod-static-nodes.service... Feb 13 07:51:54.563725 systemd[1]: Finished network-cleanup.service. Feb 13 07:51:54.563731 systemd[1]: Starting systemd-fsck-usr.service... Feb 13 07:51:54.563736 systemd[1]: Starting systemd-journald.service... Feb 13 07:51:54.563742 systemd[1]: Starting systemd-modules-load.service... Feb 13 07:51:54.563749 systemd-journald[267]: Journal started Feb 13 07:51:54.563776 systemd-journald[267]: Runtime Journal (/run/log/journal/f37fc336e4674d39a2640ee3ef27a400) is 8.0M, max 640.1M, 632.1M free. Feb 13 07:51:54.566471 systemd-modules-load[268]: Inserted module 'overlay' Feb 13 07:51:54.596301 kernel: audit: type=1334 audit(1707810714.572:2): prog-id=6 op=LOAD Feb 13 07:51:54.596311 systemd[1]: Starting systemd-resolved.service... Feb 13 07:51:54.572000 audit: BPF prog-id=6 op=LOAD Feb 13 07:51:54.641677 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Feb 13 07:51:54.641694 systemd[1]: Starting systemd-vconsole-setup.service... Feb 13 07:51:54.673676 kernel: Bridge firewalling registered Feb 13 07:51:54.673693 systemd[1]: Started systemd-journald.service. Feb 13 07:51:54.688136 systemd-modules-load[268]: Inserted module 'br_netfilter' Feb 13 07:51:54.737310 kernel: audit: type=1130 audit(1707810714.695:3): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:51:54.695000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 07:51:54.694565 systemd-resolved[270]: Positive Trust Anchors: Feb 13 07:51:54.812818 kernel: SCSI subsystem initialized Feb 13 07:51:54.812831 kernel: audit: type=1130 audit(1707810714.749:4): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:51:54.812842 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Feb 13 07:51:54.749000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:51:54.694571 systemd-resolved[270]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Feb 13 07:51:54.913562 kernel: device-mapper: uevent: version 1.0.3 Feb 13 07:51:54.913573 kernel: device-mapper: ioctl: 4.45.0-ioctl (2021-03-22) initialised: dm-devel@redhat.com Feb 13 07:51:54.913581 kernel: audit: type=1130 audit(1707810714.870:5): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:51:54.870000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:51:54.694590 systemd-resolved[270]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa corp home internal intranet lan local private test Feb 13 07:51:54.986883 kernel: audit: type=1130 audit(1707810714.921:6): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-usr comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:51:54.921000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-usr comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:51:54.695906 systemd[1]: Finished kmod-static-nodes.service. Feb 13 07:51:54.995000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:51:54.696214 systemd-resolved[270]: Defaulting to hostname 'linux'. Feb 13 07:51:55.094853 kernel: audit: type=1130 audit(1707810714.995:7): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:51:55.094866 kernel: audit: type=1130 audit(1707810715.048:8): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:51:55.048000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? 
terminal=? res=success' Feb 13 07:51:54.749774 systemd[1]: Started systemd-resolved.service. Feb 13 07:51:54.871045 systemd[1]: Finished systemd-fsck-usr.service. Feb 13 07:51:54.913911 systemd-modules-load[268]: Inserted module 'dm_multipath' Feb 13 07:51:54.921919 systemd[1]: Finished systemd-modules-load.service. Feb 13 07:51:54.995977 systemd[1]: Finished systemd-vconsole-setup.service. Feb 13 07:51:55.048911 systemd[1]: Reached target nss-lookup.target. Feb 13 07:51:55.104278 systemd[1]: Starting dracut-cmdline-ask.service... Feb 13 07:51:55.124399 systemd[1]: Starting systemd-sysctl.service... Feb 13 07:51:55.124819 systemd[1]: Starting systemd-tmpfiles-setup-dev.service... Feb 13 07:51:55.127763 systemd[1]: Finished systemd-tmpfiles-setup-dev.service. Feb 13 07:51:55.127000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:51:55.128466 systemd[1]: Finished systemd-sysctl.service. Feb 13 07:51:55.176633 kernel: audit: type=1130 audit(1707810715.127:9): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:51:55.189000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:51:55.189975 systemd[1]: Finished dracut-cmdline-ask.service. Feb 13 07:51:55.254733 kernel: audit: type=1130 audit(1707810715.189:10): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:51:55.245000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:51:55.246277 systemd[1]: Starting dracut-cmdline.service... Feb 13 07:51:55.270744 dracut-cmdline[293]: dracut-dracut-053 Feb 13 07:51:55.270744 dracut-cmdline[293]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty0 console=ttyS1,115200n8 flatcar.first_boot=detected flatcar.oem.id=packet flatcar.autologin verity.usrhash=f2beb0668e3dab90bbcf0ace3803b7ee02142bfb86913ef12ef6d2ee81a411a4 Feb 13 07:51:55.336726 kernel: Loading iSCSI transport class v2.0-870. Feb 13 07:51:55.336738 kernel: iscsi: registered transport (tcp) Feb 13 07:51:55.383597 kernel: iscsi: registered transport (qla4xxx) Feb 13 07:51:55.383618 kernel: QLogic iSCSI HBA Driver Feb 13 07:51:55.399664 systemd[1]: Finished dracut-cmdline.service. Feb 13 07:51:55.399000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:51:55.400229 systemd[1]: Starting dracut-pre-udev.service...
Feb 13 07:51:55.455688 kernel: raid6: avx2x4 gen() 47868 MB/s Feb 13 07:51:55.490691 kernel: raid6: avx2x4 xor() 14926 MB/s Feb 13 07:51:55.525663 kernel: raid6: avx2x2 gen() 53857 MB/s Feb 13 07:51:55.560691 kernel: raid6: avx2x2 xor() 32794 MB/s Feb 13 07:51:55.595664 kernel: raid6: avx2x1 gen() 45414 MB/s Feb 13 07:51:55.629658 kernel: raid6: avx2x1 xor() 28501 MB/s Feb 13 07:51:55.663661 kernel: raid6: sse2x4 gen() 21821 MB/s Feb 13 07:51:55.697670 kernel: raid6: sse2x4 xor() 11980 MB/s Feb 13 07:51:55.731691 kernel: raid6: sse2x2 gen() 22120 MB/s Feb 13 07:51:55.765668 kernel: raid6: sse2x2 xor() 13738 MB/s Feb 13 07:51:55.799664 kernel: raid6: sse2x1 gen() 18689 MB/s Feb 13 07:51:55.851256 kernel: raid6: sse2x1 xor() 9128 MB/s Feb 13 07:51:55.851271 kernel: raid6: using algorithm avx2x2 gen() 53857 MB/s Feb 13 07:51:55.851279 kernel: raid6: .... xor() 32794 MB/s, rmw enabled Feb 13 07:51:55.869319 kernel: raid6: using avx2x2 recovery algorithm Feb 13 07:51:55.915636 kernel: xor: automatically using best checksumming function avx Feb 13 07:51:55.992665 kernel: Btrfs loaded, crc32c=crc32c-intel, zoned=no, fsverity=no Feb 13 07:51:55.997821 systemd[1]: Finished dracut-pre-udev.service. Feb 13 07:51:56.006000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:51:56.006000 audit: BPF prog-id=7 op=LOAD Feb 13 07:51:56.006000 audit: BPF prog-id=8 op=LOAD Feb 13 07:51:56.007616 systemd[1]: Starting systemd-udevd.service... Feb 13 07:51:56.015928 systemd-udevd[473]: Using default interface naming scheme 'v252'. Feb 13 07:51:56.038000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:51:56.021854 systemd[1]: Started systemd-udevd.service. Feb 13 07:51:56.062755 dracut-pre-trigger[485]: rd.md=0: removing MD RAID activation Feb 13 07:51:56.039322 systemd[1]: Starting dracut-pre-trigger.service... Feb 13 07:51:56.080000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:51:56.066515 systemd[1]: Finished dracut-pre-trigger.service. Feb 13 07:51:56.081876 systemd[1]: Starting systemd-udev-trigger.service... Feb 13 07:51:56.131032 systemd[1]: Finished systemd-udev-trigger.service. Feb 13 07:51:56.130000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:51:56.158644 kernel: cryptd: max_cpu_qlen set to 1000 Feb 13 07:51:56.194434 kernel: ACPI: bus type USB registered Feb 13 07:51:56.194473 kernel: usbcore: registered new interface driver usbfs Feb 13 07:51:56.194483 kernel: usbcore: registered new interface driver hub Feb 13 07:51:56.212114 kernel: usbcore: registered new device driver usb Feb 13 07:51:56.230638 kernel: libata version 3.00 loaded. Feb 13 07:51:56.254670 kernel: AVX2 version of gcm_enc/dec engaged. 
Feb 13 07:51:56.254709 kernel: AES CTR mode by8 optimization enabled Feb 13 07:51:56.254718 kernel: ahci 0000:00:17.0: version 3.0 Feb 13 07:51:56.254799 kernel: mlx5_core 0000:01:00.0: firmware version: 14.27.1016 Feb 13 07:51:56.254857 kernel: mlx5_core 0000:01:00.0: 63.008 Gb/s available PCIe bandwidth (8.0 GT/s PCIe x8 link) Feb 13 07:51:56.345728 kernel: igb: Intel(R) Gigabit Ethernet Network Driver Feb 13 07:51:56.345760 kernel: ahci 0000:00:17.0: AHCI 0001.0301 32 slots 7 ports 6 Gbps 0x7f impl SATA mode Feb 13 07:51:56.345857 kernel: igb: Copyright (c) 2007-2014 Intel Corporation. Feb 13 07:51:56.345873 kernel: ahci 0000:00:17.0: flags: 64bit ncq sntf clo only pio slum part ems deso sadm sds apst Feb 13 07:51:56.383382 kernel: scsi host0: ahci Feb 13 07:51:56.410227 kernel: scsi host1: ahci Feb 13 07:51:56.410529 kernel: xhci_hcd 0000:00:14.0: xHCI Host Controller Feb 13 07:51:56.410606 kernel: scsi host2: ahci Feb 13 07:51:56.419717 kernel: pps pps0: new PPS source ptp0 Feb 13 07:51:56.419802 kernel: igb 0000:03:00.0: added PHC on eth0 Feb 13 07:51:56.419862 kernel: igb 0000:03:00.0: Intel(R) Gigabit Ethernet Network Connection Feb 13 07:51:56.419922 kernel: igb 0000:03:00.0: eth0: (PCIe:2.5Gb/s:Width x1) 00:25:90:bd:75:24 Feb 13 07:51:56.419970 kernel: igb 0000:03:00.0: eth0: PBA No: 010000-000 Feb 13 07:51:56.420016 kernel: igb 0000:03:00.0: Using MSI-X interrupts. 4 rx queue(s), 4 tx queue(s) Feb 13 07:51:56.426326 kernel: xhci_hcd 0000:00:14.0: new USB bus registered, assigned bus number 1 Feb 13 07:51:56.439682 kernel: xhci_hcd 0000:00:14.0: hcc params 0x200077c1 hci version 0x110 quirks 0x0000000000009810 Feb 13 07:51:56.439813 kernel: scsi host3: ahci Feb 13 07:51:56.461659 kernel: xhci_hcd 0000:00:14.0: xHCI Host Controller Feb 13 07:51:56.461830 kernel: pps pps1: new PPS source ptp1 Feb 13 07:51:56.461906 kernel: igb 0000:04:00.0: added PHC on eth1 Feb 13 07:51:56.461986 kernel: igb 0000:04:00.0: Intel(R) Gigabit Ethernet Network Connection Feb 13 07:51:56.462078 kernel: igb 0000:04:00.0: eth1: (PCIe:2.5Gb/s:Width x1) 00:25:90:bd:75:25 Feb 13 07:51:56.462150 kernel: igb 0000:04:00.0: eth1: PBA No: 010000-000 Feb 13 07:51:56.462221 kernel: igb 0000:04:00.0: Using MSI-X interrupts. 
4 rx queue(s), 4 tx queue(s) Feb 13 07:51:56.484255 kernel: scsi host4: ahci Feb 13 07:51:56.484306 kernel: xhci_hcd 0000:00:14.0: new USB bus registered, assigned bus number 2 Feb 13 07:51:56.513865 kernel: scsi host5: ahci Feb 13 07:51:56.513891 kernel: mlx5_core 0000:01:00.0: E-Switch: Total vports 10, per vport: max uc(1024) max mc(16384) Feb 13 07:51:56.513976 kernel: xhci_hcd 0000:00:14.0: Host supports USB 3.1 Enhanced SuperSpeed Feb 13 07:51:56.514635 kernel: igb 0000:03:00.0 eno1: renamed from eth0 Feb 13 07:51:56.561954 kernel: scsi host6: ahci Feb 13 07:51:56.562035 kernel: hub 1-0:1.0: USB hub found Feb 13 07:51:56.562101 kernel: ata1: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516100 irq 127 Feb 13 07:51:56.584788 kernel: hub 1-0:1.0: 16 ports detected Feb 13 07:51:56.584865 kernel: ata2: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516180 irq 127 Feb 13 07:51:56.622344 kernel: hub 2-0:1.0: USB hub found Feb 13 07:51:56.622423 kernel: ata3: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516200 irq 127 Feb 13 07:51:56.637031 kernel: hub 2-0:1.0: 10 ports detected Feb 13 07:51:56.637105 kernel: ata4: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516280 irq 127 Feb 13 07:51:56.665128 kernel: usb: port power management may be unreliable Feb 13 07:51:56.665146 kernel: ata5: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516300 irq 127 Feb 13 07:51:56.867789 kernel: usb 1-14: new high-speed USB device number 2 using xhci_hcd Feb 13 07:51:56.867813 kernel: ata6: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516380 irq 127 Feb 13 07:51:56.971026 kernel: ata7: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516400 irq 127 Feb 13 07:51:56.990685 kernel: igb 0000:04:00.0 eno2: renamed from eth1 Feb 13 07:51:57.024347 kernel: hub 1-14:1.0: USB hub found Feb 13 07:51:57.024435 kernel: hub 1-14:1.0: 4 ports detected Feb 13 07:51:57.043691 kernel: mlx5_core 0000:01:00.0: MLX5E: StrdRq(0) RqSz(1024) StrdSz(256) RxCqeCmprss(0) Feb 13 07:51:57.263733 kernel: mlx5_core 0000:01:00.0: Supported tc offload range - chains: 4294967294, prios: 4294967295 Feb 13 07:51:57.307525 kernel: mlx5_core 0000:01:00.1: firmware version: 14.27.1016 Feb 13 07:51:57.307676 kernel: ata1: SATA link up 6.0 Gbps (SStatus 133 SControl 300) Feb 13 07:51:57.307702 kernel: mlx5_core 0000:01:00.1: 63.008 Gb/s available PCIe bandwidth (8.0 GT/s PCIe x8 link) Feb 13 07:51:57.307767 kernel: ata6: SATA link down (SStatus 0 SControl 300) Feb 13 07:51:57.343673 kernel: ata3: SATA link down (SStatus 0 SControl 300) Feb 13 07:51:57.359662 kernel: ata1.00: ATA-10: Micron_5200_MTFDDAK480TDN, D1MU004, max UDMA/133 Feb 13 07:51:57.377635 kernel: ata4: SATA link down (SStatus 0 SControl 300) Feb 13 07:51:57.393711 kernel: usb 1-14.1: new low-speed USB device number 3 using xhci_hcd Feb 13 07:51:57.393755 kernel: ata5: SATA link down (SStatus 0 SControl 300) Feb 13 07:51:57.426644 kernel: ata2: SATA link up 6.0 Gbps (SStatus 133 SControl 300) Feb 13 07:51:57.443636 kernel: ata7: SATA link down (SStatus 0 SControl 300) Feb 13 07:51:57.459670 kernel: ata2.00: ATA-10: Micron_5200_MTFDDAK480TDN, D1MU020, max UDMA/133 Feb 13 07:51:57.510527 kernel: ata1.00: 937703088 sectors, multi 16: LBA48 NCQ (depth 32), AA Feb 13 07:51:57.510549 kernel: ata1.00: Features: NCQ-prio Feb 13 07:51:57.510664 kernel: hid: raw HID events driver (C) Jiri Kosina Feb 13 07:51:57.526699 kernel: ata2.00: 937703088 sectors, multi 16: LBA48 NCQ (depth 32), AA Feb 13 07:51:57.558037 kernel: ata2.00: Features: NCQ-prio Feb 13 07:51:57.575710 kernel: 
ata1.00: configured for UDMA/133 Feb 13 07:51:57.575742 kernel: scsi 0:0:0:0: Direct-Access ATA Micron_5200_MTFD U004 PQ: 0 ANSI: 5 Feb 13 07:51:57.594658 kernel: ata2.00: configured for UDMA/133 Feb 13 07:51:57.609680 kernel: mlx5_core 0000:01:00.1: E-Switch: Total vports 10, per vport: max uc(1024) max mc(16384) Feb 13 07:51:57.609776 kernel: scsi 1:0:0:0: Direct-Access ATA Micron_5200_MTFD U020 PQ: 0 ANSI: 5 Feb 13 07:51:57.649689 kernel: port_module: 9 callbacks suppressed Feb 13 07:51:57.649709 kernel: mlx5_core 0000:01:00.1: Port module event: module 1, Cable plugged Feb 13 07:51:57.704696 kernel: usbcore: registered new interface driver usbhid Feb 13 07:51:57.704729 kernel: mlx5_core 0000:01:00.1: MLX5E: StrdRq(0) RqSz(1024) StrdSz(256) RxCqeCmprss(0) Feb 13 07:51:57.704797 kernel: usbhid: USB HID core driver Feb 13 07:51:57.775637 kernel: input: HID 0557:2419 as /devices/pci0000:00/0000:00:14.0/usb1/1-14/1-14.1/1-14.1:1.0/0003:0557:2419.0001/input/input0 Feb 13 07:51:57.792517 kernel: ata1.00: Enabling discard_zeroes_data Feb 13 07:51:57.792536 kernel: ata2.00: Enabling discard_zeroes_data Feb 13 07:51:57.827678 kernel: sd 0:0:0:0: [sda] 937703088 512-byte logical blocks: (480 GB/447 GiB) Feb 13 07:51:57.827769 kernel: sd 1:0:0:0: [sdb] 937703088 512-byte logical blocks: (480 GB/447 GiB) Feb 13 07:51:57.827842 kernel: hid-generic 0003:0557:2419.0001: input,hidraw0: USB HID v1.00 Keyboard [HID 0557:2419] on usb-0000:00:14.0-14.1/input0 Feb 13 07:51:57.827920 kernel: input: HID 0557:2419 as /devices/pci0000:00/0000:00:14.0/usb1/1-14/1-14.1/1-14.1:1.1/0003:0557:2419.0002/input/input1 Feb 13 07:51:57.827931 kernel: hid-generic 0003:0557:2419.0002: input,hidraw1: USB HID v1.00 Mouse [HID 0557:2419] on usb-0000:00:14.0-14.1/input1 Feb 13 07:51:57.828004 kernel: sd 0:0:0:0: [sda] 4096-byte physical blocks Feb 13 07:51:57.880650 kernel: sd 1:0:0:0: [sdb] 4096-byte physical blocks Feb 13 07:51:57.880735 kernel: sd 1:0:0:0: [sdb] Write Protect is off Feb 13 07:51:57.880803 kernel: sd 0:0:0:0: [sda] Write Protect is off Feb 13 07:51:57.915938 kernel: sd 1:0:0:0: [sdb] Mode Sense: 00 3a 00 00 Feb 13 07:51:57.916114 kernel: sd 1:0:0:0: [sdb] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA Feb 13 07:51:57.926639 kernel: mlx5_core 0000:01:00.1: Supported tc offload range - chains: 4294967294, prios: 4294967295 Feb 13 07:51:57.951690 kernel: sd 0:0:0:0: [sda] Mode Sense: 00 3a 00 00 Feb 13 07:51:58.078570 kernel: ata2.00: Enabling discard_zeroes_data Feb 13 07:51:58.078593 kernel: sd 0:0:0:0: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA Feb 13 07:51:58.136742 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Feb 13 07:51:58.136762 kernel: ata1.00: Enabling discard_zeroes_data Feb 13 07:51:58.136774 kernel: GPT:9289727 != 937703087 Feb 13 07:51:58.169073 kernel: ata1.00: Enabling discard_zeroes_data Feb 13 07:51:58.169095 kernel: GPT:Alternate GPT header not at the end of the disk. Feb 13 07:51:58.169107 kernel: sd 0:0:0:0: [sda] Attached SCSI disk Feb 13 07:51:58.204073 kernel: GPT:9289727 != 937703087 Feb 13 07:51:58.204086 kernel: GPT: Use GNU Parted to correct GPT errors. 
Feb 13 07:51:58.204096 kernel: sdb: sdb1 sdb2 sdb3 sdb4 sdb6 sdb7 sdb9 Feb 13 07:51:58.254684 kernel: ata2.00: Enabling discard_zeroes_data Feb 13 07:51:58.270742 kernel: sd 1:0:0:0: [sdb] Attached SCSI disk Feb 13 07:51:58.307637 kernel: mlx5_core 0000:01:00.1 enp1s0f1np1: renamed from eth1 Feb 13 07:51:58.331902 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device. Feb 13 07:51:58.375903 kernel: mlx5_core 0000:01:00.0 enp1s0f0np0: renamed from eth0 Feb 13 07:51:58.375985 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/sdb6 scanned by (udev-worker) (525) Feb 13 07:51:58.373880 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device. Feb 13 07:51:58.388109 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device. Feb 13 07:51:58.393795 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device. Feb 13 07:51:58.430853 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device. Feb 13 07:51:58.448801 systemd[1]: Starting disk-uuid.service... Feb 13 07:51:58.470728 disk-uuid[690]: Primary Header is updated. Feb 13 07:51:58.470728 disk-uuid[690]: Secondary Entries is updated. Feb 13 07:51:58.470728 disk-uuid[690]: Secondary Header is updated. Feb 13 07:51:58.526702 kernel: ata2.00: Enabling discard_zeroes_data Feb 13 07:51:58.526727 kernel: sdb: sdb1 sdb2 sdb3 sdb4 sdb6 sdb7 sdb9 Feb 13 07:51:58.526742 kernel: ata2.00: Enabling discard_zeroes_data Feb 13 07:51:58.526759 kernel: sdb: sdb1 sdb2 sdb3 sdb4 sdb6 sdb7 sdb9 Feb 13 07:51:58.552386 kernel: ata2.00: Enabling discard_zeroes_data Feb 13 07:51:58.570663 kernel: sdb: sdb1 sdb2 sdb3 sdb4 sdb6 sdb7 sdb9 Feb 13 07:51:59.551517 kernel: ata2.00: Enabling discard_zeroes_data Feb 13 07:51:59.570637 kernel: sdb: sdb1 sdb2 sdb3 sdb4 sdb6 sdb7 sdb9 Feb 13 07:51:59.570686 disk-uuid[692]: The operation has completed successfully. Feb 13 07:51:59.611704 systemd[1]: disk-uuid.service: Deactivated successfully. Feb 13 07:51:59.706720 kernel: audit: type=1130 audit(1707810719.618:19): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:51:59.706735 kernel: audit: type=1131 audit(1707810719.618:20): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:51:59.618000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:51:59.618000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:51:59.611749 systemd[1]: Finished disk-uuid.service. Feb 13 07:51:59.735722 kernel: device-mapper: verity: sha256 using implementation "sha256-avx2" Feb 13 07:51:59.619357 systemd[1]: Starting verity-setup.service... Feb 13 07:51:59.766355 systemd[1]: Found device dev-mapper-usr.device. Feb 13 07:51:59.767225 systemd[1]: Mounting sysusr-usr.mount... Feb 13 07:51:59.786834 systemd[1]: Finished verity-setup.service. Feb 13 07:51:59.805000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=verity-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 07:51:59.850671 kernel: audit: type=1130 audit(1707810719.805:21): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=verity-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:51:59.877222 systemd[1]: Mounted sysusr-usr.mount. Feb 13 07:51:59.896517 kernel: EXT4-fs (dm-0): mounted filesystem without journal. Opts: norecovery. Quota mode: none. Feb 13 07:51:59.896530 kernel: BTRFS info (device sdb6): using crc32c (crc32c-intel) checksum algorithm Feb 13 07:51:59.877386 systemd[1]: afterburn-network-kargs.service was skipped because no trigger condition checks were met. Feb 13 07:51:59.975734 kernel: BTRFS info (device sdb6): using free space tree Feb 13 07:51:59.975748 kernel: BTRFS info (device sdb6): has skinny extents Feb 13 07:51:59.975755 kernel: BTRFS info (device sdb6): enabling ssd optimizations Feb 13 07:51:59.877806 systemd[1]: Starting ignition-setup.service... Feb 13 07:51:59.963893 systemd[1]: Starting parse-ip-for-networkd.service... Feb 13 07:51:59.999000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:51:59.984092 systemd[1]: Finished ignition-setup.service. Feb 13 07:52:00.059770 kernel: audit: type=1130 audit(1707810719.999:22): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:52:00.000299 systemd[1]: Starting ignition-fetch-offline.service... Feb 13 07:52:00.067000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:52:00.052941 systemd[1]: Finished parse-ip-for-networkd.service. Feb 13 07:52:00.134726 kernel: audit: type=1130 audit(1707810720.067:23): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:52:00.134747 kernel: audit: type=1334 audit(1707810720.113:24): prog-id=9 op=LOAD Feb 13 07:52:00.113000 audit: BPF prog-id=9 op=LOAD Feb 13 07:52:00.127869 ignition[867]: Ignition 2.14.0 Feb 13 07:52:00.114731 systemd[1]: Starting systemd-networkd.service... Feb 13 07:52:00.127873 ignition[867]: Stage: fetch-offline Feb 13 07:52:00.203670 kernel: audit: type=1130 audit(1707810720.156:25): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:52:00.156000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 07:52:00.148506 systemd-networkd[880]: lo: Link UP Feb 13 07:52:00.127896 ignition[867]: reading system config file "/usr/lib/ignition/base.d/base.ign" Feb 13 07:52:00.148508 systemd-networkd[880]: lo: Gained carrier Feb 13 07:52:00.127909 ignition[867]: parsing config with SHA512: 0131bd505bfe1b1215ca4ec9809701a3323bf448114294874f7249d8d300440bd742a7532f60673bfa0746c04de0bd5ca68d0fe9a8ecd59464b13a6401323cb4 Feb 13 07:52:00.148800 systemd-networkd[880]: Enumeration completed Feb 13 07:52:00.247000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=iscsiuio comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:52:00.136798 ignition[867]: no config dir at "/usr/lib/ignition/base.platform.d/packet" Feb 13 07:52:00.373281 kernel: audit: type=1130 audit(1707810720.247:26): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=iscsiuio comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:52:00.373299 kernel: audit: type=1130 audit(1707810720.303:27): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:52:00.373307 kernel: mlx5_core 0000:01:00.1 enp1s0f1np1: Link up Feb 13 07:52:00.303000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:52:00.148904 systemd[1]: Started systemd-networkd.service. Feb 13 07:52:00.396315 kernel: IPv6: ADDRCONF(NETDEV_CHANGE): enp1s0f1np1: link becomes ready Feb 13 07:52:00.136861 ignition[867]: parsed url from cmdline: "" Feb 13 07:52:00.149539 systemd-networkd[880]: enp1s0f1np1: Configuring with /usr/lib/systemd/network/zz-default.network. Feb 13 07:52:00.136863 ignition[867]: no config URL provided Feb 13 07:52:00.428000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=iscsid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:52:00.156611 unknown[867]: fetched base config from "system" Feb 13 07:52:00.451733 iscsid[909]: iscsid: can't open InitiatorName configuration file /etc/iscsi/initiatorname.iscsi Feb 13 07:52:00.451733 iscsid[909]: iscsid: Warning: InitiatorName file /etc/iscsi/initiatorname.iscsi does not exist or does not contain a properly formatted InitiatorName. If using software iscsi (iscsi_tcp or ib_iser) or partial offload (bnx2i or cxgbi iscsi), you may not be able to log Feb 13 07:52:00.451733 iscsid[909]: into or discover targets. Please create a file /etc/iscsi/initiatorname.iscsi that contains a string with the format: InitiatorName=iqn.yyyy-mm.<reversed domain name>[:identifier]. Feb 13 07:52:00.451733 iscsid[909]: Example: InitiatorName=iqn.2001-04.com.redhat:fc6. Feb 13 07:52:00.451733 iscsid[909]: If using hardware iscsi like qla4xxx this message can be ignored. Feb 13 07:52:00.451733 iscsid[909]: iscsid: can't open InitiatorAlias configuration file /etc/iscsi/initiatorname.iscsi Feb 13 07:52:00.451733 iscsid[909]: iscsid: can't open iscsid.safe_logout configuration file /etc/iscsi/iscsid.conf Feb 13 07:52:00.459000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=?
res=success' Feb 13 07:52:00.136868 ignition[867]: reading system config file "/usr/lib/ignition/user.ign" Feb 13 07:52:00.629827 kernel: mlx5_core 0000:01:00.0 enp1s0f0np0: Link up Feb 13 07:52:00.619000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:52:00.156615 unknown[867]: fetched user config from "system" Feb 13 07:52:00.136899 ignition[867]: parsing config with SHA512: b336fede4491091083b08ff6ee0bb2dc831f21818599c896359d4556f52fc8b4061feea9b2b727c5157c746eafe522d27bf6e62777379a7ad0d994149795fb66 Feb 13 07:52:00.156891 systemd[1]: Reached target network.target. Feb 13 07:52:00.156962 ignition[867]: fetch-offline: fetch-offline passed Feb 13 07:52:00.212313 systemd[1]: Starting iscsiuio.service... Feb 13 07:52:00.156965 ignition[867]: POST message to Packet Timeline Feb 13 07:52:00.225923 systemd[1]: Started iscsiuio.service. Feb 13 07:52:00.156969 ignition[867]: POST Status error: resource requires networking Feb 13 07:52:00.248060 systemd[1]: Finished ignition-fetch-offline.service. Feb 13 07:52:00.156999 ignition[867]: Ignition finished successfully Feb 13 07:52:00.304021 systemd[1]: ignition-fetch.service was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). Feb 13 07:52:00.378186 ignition[897]: Ignition 2.14.0 Feb 13 07:52:00.304515 systemd[1]: Starting ignition-kargs.service... Feb 13 07:52:00.378191 ignition[897]: Stage: kargs Feb 13 07:52:00.374605 systemd-networkd[880]: enp1s0f0np0: Configuring with /usr/lib/systemd/network/zz-default.network. Feb 13 07:52:00.378257 ignition[897]: reading system config file "/usr/lib/ignition/base.d/base.ign" Feb 13 07:52:00.387262 systemd[1]: Starting iscsid.service... Feb 13 07:52:00.378268 ignition[897]: parsing config with SHA512: 0131bd505bfe1b1215ca4ec9809701a3323bf448114294874f7249d8d300440bd742a7532f60673bfa0746c04de0bd5ca68d0fe9a8ecd59464b13a6401323cb4 Feb 13 07:52:00.407822 systemd[1]: Started iscsid.service. Feb 13 07:52:00.380728 ignition[897]: no config dir at "/usr/lib/ignition/base.platform.d/packet" Feb 13 07:52:00.429171 systemd[1]: Starting dracut-initqueue.service... Feb 13 07:52:00.381384 ignition[897]: kargs: kargs passed Feb 13 07:52:00.443847 systemd[1]: Finished dracut-initqueue.service. Feb 13 07:52:00.381387 ignition[897]: POST message to Packet Timeline Feb 13 07:52:00.459895 systemd[1]: Reached target remote-fs-pre.target. Feb 13 07:52:00.381396 ignition[897]: GET https://metadata.packet.net/metadata: attempt #1 Feb 13 07:52:00.504788 systemd[1]: Reached target remote-cryptsetup.target. Feb 13 07:52:00.382941 ignition[897]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:55589->[::1]:53: read: connection refused Feb 13 07:52:00.526045 systemd[1]: Reached target remote-fs.target. Feb 13 07:52:00.583412 ignition[897]: GET https://metadata.packet.net/metadata: attempt #2 Feb 13 07:52:00.541254 systemd[1]: Starting dracut-pre-mount.service... Feb 13 07:52:00.583890 ignition[897]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:53496->[::1]:53: read: connection refused Feb 13 07:52:00.574885 systemd[1]: Finished dracut-pre-mount.service. Feb 13 07:52:00.612212 systemd-networkd[880]: eno2: Configuring with /usr/lib/systemd/network/zz-default.network. 
Feb 13 07:52:00.641393 systemd-networkd[880]: eno1: Configuring with /usr/lib/systemd/network/zz-default.network. Feb 13 07:52:00.671346 systemd-networkd[880]: enp1s0f1np1: Link UP Feb 13 07:52:00.671692 systemd-networkd[880]: enp1s0f1np1: Gained carrier Feb 13 07:52:00.683123 systemd-networkd[880]: enp1s0f0np0: Link UP Feb 13 07:52:00.683477 systemd-networkd[880]: eno2: Link UP Feb 13 07:52:00.683835 systemd-networkd[880]: eno1: Link UP Feb 13 07:52:00.984657 ignition[897]: GET https://metadata.packet.net/metadata: attempt #3 Feb 13 07:52:00.985771 ignition[897]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:51682->[::1]:53: read: connection refused Feb 13 07:52:01.441037 systemd-networkd[880]: enp1s0f0np0: Gained carrier Feb 13 07:52:01.449873 kernel: IPv6: ADDRCONF(NETDEV_CHANGE): enp1s0f0np0: link becomes ready Feb 13 07:52:01.473824 systemd-networkd[880]: enp1s0f0np0: DHCPv4 address 145.40.90.207/31, gateway 145.40.90.206 acquired from 145.40.83.140 Feb 13 07:52:01.786296 ignition[897]: GET https://metadata.packet.net/metadata: attempt #4 Feb 13 07:52:01.787280 ignition[897]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:36646->[::1]:53: read: connection refused Feb 13 07:52:01.868228 systemd-networkd[880]: enp1s0f1np1: Gained IPv6LL Feb 13 07:52:02.700223 systemd-networkd[880]: enp1s0f0np0: Gained IPv6LL Feb 13 07:52:03.388992 ignition[897]: GET https://metadata.packet.net/metadata: attempt #5 Feb 13 07:52:03.390204 ignition[897]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:44491->[::1]:53: read: connection refused Feb 13 07:52:06.593683 ignition[897]: GET https://metadata.packet.net/metadata: attempt #6 Feb 13 07:52:06.632809 ignition[897]: GET result: OK Feb 13 07:52:06.818447 ignition[897]: Ignition finished successfully Feb 13 07:52:06.822997 systemd[1]: Finished ignition-kargs.service. Feb 13 07:52:06.904583 kernel: kauditd_printk_skb: 3 callbacks suppressed Feb 13 07:52:06.904603 kernel: audit: type=1130 audit(1707810726.833:31): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:52:06.833000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:52:06.842620 ignition[927]: Ignition 2.14.0 Feb 13 07:52:06.835994 systemd[1]: Starting ignition-disks.service... 
Feb 13 07:52:06.842623 ignition[927]: Stage: disks Feb 13 07:52:06.842745 ignition[927]: reading system config file "/usr/lib/ignition/base.d/base.ign" Feb 13 07:52:06.842756 ignition[927]: parsing config with SHA512: 0131bd505bfe1b1215ca4ec9809701a3323bf448114294874f7249d8d300440bd742a7532f60673bfa0746c04de0bd5ca68d0fe9a8ecd59464b13a6401323cb4 Feb 13 07:52:06.844986 ignition[927]: no config dir at "/usr/lib/ignition/base.platform.d/packet" Feb 13 07:52:06.845672 ignition[927]: disks: disks passed Feb 13 07:52:06.845675 ignition[927]: POST message to Packet Timeline Feb 13 07:52:06.845685 ignition[927]: GET https://metadata.packet.net/metadata: attempt #1 Feb 13 07:52:06.868450 ignition[927]: GET result: OK Feb 13 07:52:07.086723 ignition[927]: Ignition finished successfully Feb 13 07:52:07.089624 systemd[1]: Finished ignition-disks.service. Feb 13 07:52:07.102000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:52:07.103161 systemd[1]: Reached target initrd-root-device.target. Feb 13 07:52:07.177927 kernel: audit: type=1130 audit(1707810727.102:32): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:52:07.163871 systemd[1]: Reached target local-fs-pre.target. Feb 13 07:52:07.163905 systemd[1]: Reached target local-fs.target. Feb 13 07:52:07.186857 systemd[1]: Reached target sysinit.target. Feb 13 07:52:07.200848 systemd[1]: Reached target basic.target. Feb 13 07:52:07.215536 systemd[1]: Starting systemd-fsck-root.service... Feb 13 07:52:07.235701 systemd-fsck[942]: ROOT: clean, 602/553520 files, 56013/553472 blocks Feb 13 07:52:07.249008 systemd[1]: Finished systemd-fsck-root.service. Feb 13 07:52:07.337005 kernel: audit: type=1130 audit(1707810727.257:33): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:52:07.337019 kernel: EXT4-fs (sdb9): mounted filesystem with ordered data mode. Opts: (null). Quota mode: none. Feb 13 07:52:07.257000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:52:07.263420 systemd[1]: Mounting sysroot.mount... Feb 13 07:52:07.344251 systemd[1]: Mounted sysroot.mount. Feb 13 07:52:07.357895 systemd[1]: Reached target initrd-root-fs.target. Feb 13 07:52:07.365461 systemd[1]: Mounting sysroot-usr.mount... Feb 13 07:52:07.390464 systemd[1]: Starting flatcar-metadata-hostname.service... Feb 13 07:52:07.399147 systemd[1]: Starting flatcar-static-network.service... Feb 13 07:52:07.407777 systemd[1]: ignition-remount-sysroot.service was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Feb 13 07:52:07.407804 systemd[1]: Reached target ignition-diskful.target. Feb 13 07:52:07.433732 systemd[1]: Mounted sysroot-usr.mount. Feb 13 07:52:07.458673 systemd[1]: Mounting sysroot-usr-share-oem.mount... Feb 13 07:52:07.521753 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/sdb6 scanned by mount (955) Feb 13 07:52:07.521771 kernel: BTRFS info (device sdb6): using crc32c (crc32c-intel) checksum algorithm Feb 13 07:52:07.471556 systemd[1]: Starting initrd-setup-root.service... 
Feb 13 07:52:07.574823 kernel: BTRFS info (device sdb6): using free space tree Feb 13 07:52:07.574840 kernel: BTRFS info (device sdb6): has skinny extents Feb 13 07:52:07.574851 initrd-setup-root[960]: cut: /sysroot/etc/passwd: No such file or directory Feb 13 07:52:07.661893 kernel: BTRFS info (device sdb6): enabling ssd optimizations Feb 13 07:52:07.661910 kernel: audit: type=1130 audit(1707810727.595:34): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:52:07.595000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:52:07.529142 systemd[1]: Finished initrd-setup-root.service. Feb 13 07:52:07.676755 coreos-metadata[950]: Feb 13 07:52:07.553 INFO Fetching https://metadata.packet.net/metadata: Attempt #1 Feb 13 07:52:07.676755 coreos-metadata[950]: Feb 13 07:52:07.595 INFO Fetch successful Feb 13 07:52:07.696871 coreos-metadata[949]: Feb 13 07:52:07.553 INFO Fetching https://metadata.packet.net/metadata: Attempt #1 Feb 13 07:52:07.696871 coreos-metadata[949]: Feb 13 07:52:07.597 INFO Fetch successful Feb 13 07:52:07.696871 coreos-metadata[949]: Feb 13 07:52:07.614 INFO wrote hostname ci-3510.3.2-a-bf0bde3476 to /sysroot/etc/hostname Feb 13 07:52:07.909766 kernel: audit: type=1130 audit(1707810727.711:35): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-metadata-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:52:07.909787 kernel: audit: type=1130 audit(1707810727.774:36): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-static-network comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:52:07.909795 kernel: audit: type=1131 audit(1707810727.774:37): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-static-network comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:52:07.711000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-metadata-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:52:07.774000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-static-network comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:52:07.774000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-static-network comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:52:07.909863 initrd-setup-root[968]: cut: /sysroot/etc/group: No such file or directory Feb 13 07:52:07.917000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:52:07.596091 systemd[1]: Starting ignition-mount.service... Feb 13 07:52:07.982874 kernel: audit: type=1130 audit(1707810727.917:38): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 07:52:07.982959 initrd-setup-root[976]: cut: /sysroot/etc/shadow: No such file or directory Feb 13 07:52:07.669307 systemd[1]: Starting sysroot-boot.service... Feb 13 07:52:07.999881 bash[1020]: umount: /sysroot/usr/share/oem: not mounted. Feb 13 07:52:08.008904 initrd-setup-root[984]: cut: /sysroot/etc/gshadow: No such file or directory Feb 13 07:52:07.684765 systemd[1]: Finished flatcar-metadata-hostname.service. Feb 13 07:52:08.026854 ignition[1027]: INFO : Ignition 2.14.0 Feb 13 07:52:08.026854 ignition[1027]: INFO : Stage: mount Feb 13 07:52:08.026854 ignition[1027]: INFO : reading system config file "/usr/lib/ignition/base.d/base.ign" Feb 13 07:52:08.026854 ignition[1027]: DEBUG : parsing config with SHA512: 0131bd505bfe1b1215ca4ec9809701a3323bf448114294874f7249d8d300440bd742a7532f60673bfa0746c04de0bd5ca68d0fe9a8ecd59464b13a6401323cb4 Feb 13 07:52:08.026854 ignition[1027]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/packet" Feb 13 07:52:08.026854 ignition[1027]: INFO : mount: mount passed Feb 13 07:52:08.026854 ignition[1027]: INFO : POST message to Packet Timeline Feb 13 07:52:08.026854 ignition[1027]: INFO : GET https://metadata.packet.net/metadata: attempt #1 Feb 13 07:52:08.026854 ignition[1027]: INFO : GET result: OK Feb 13 07:52:07.713049 systemd[1]: flatcar-static-network.service: Deactivated successfully. Feb 13 07:52:07.713102 systemd[1]: Finished flatcar-static-network.service. Feb 13 07:52:07.774890 systemd[1]: Mounted sysroot-usr-share-oem.mount. Feb 13 07:52:07.896174 systemd[1]: Finished sysroot-boot.service. Feb 13 07:52:08.155152 ignition[1027]: INFO : Ignition finished successfully Feb 13 07:52:08.157962 systemd[1]: Finished ignition-mount.service. Feb 13 07:52:08.171000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:52:08.173759 systemd[1]: Starting ignition-files.service... Feb 13 07:52:08.242760 kernel: audit: type=1130 audit(1707810728.171:39): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 07:52:08.242826 ignition[1042]: INFO : Ignition 2.14.0 Feb 13 07:52:08.242826 ignition[1042]: INFO : Stage: files Feb 13 07:52:08.242826 ignition[1042]: INFO : reading system config file "/usr/lib/ignition/base.d/base.ign" Feb 13 07:52:08.242826 ignition[1042]: DEBUG : parsing config with SHA512: 0131bd505bfe1b1215ca4ec9809701a3323bf448114294874f7249d8d300440bd742a7532f60673bfa0746c04de0bd5ca68d0fe9a8ecd59464b13a6401323cb4 Feb 13 07:52:08.248110 unknown[1042]: wrote ssh authorized keys file for user: core Feb 13 07:52:08.294809 ignition[1042]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/packet" Feb 13 07:52:08.294809 ignition[1042]: DEBUG : files: compiled without relabeling support, skipping Feb 13 07:52:08.294809 ignition[1042]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Feb 13 07:52:08.294809 ignition[1042]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Feb 13 07:52:08.294809 ignition[1042]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Feb 13 07:52:08.294809 ignition[1042]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Feb 13 07:52:08.294809 ignition[1042]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Feb 13 07:52:08.294809 ignition[1042]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" Feb 13 07:52:08.294809 ignition[1042]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1 Feb 13 07:52:08.294809 ignition[1042]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Feb 13 07:52:08.420950 ignition[1042]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" Feb 13 07:52:08.420950 ignition[1042]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/opt/cni-plugins-linux-amd64-v1.3.0.tgz" Feb 13 07:52:08.420950 ignition[1042]: INFO : files: createFilesystemsFiles: createFiles: op(4): GET https://github.com/containernetworking/plugins/releases/download/v1.3.0/cni-plugins-linux-amd64-v1.3.0.tgz: attempt #1 Feb 13 07:52:08.812024 ignition[1042]: INFO : files: createFilesystemsFiles: createFiles: op(4): GET result: OK Feb 13 07:52:08.909718 ignition[1042]: DEBUG : files: createFilesystemsFiles: createFiles: op(4): file matches expected sum of: 5d0324ca8a3c90c680b6e1fddb245a2255582fa15949ba1f3c6bb7323df9d3af754dae98d6e40ac9ccafb2999c932df2c4288d418949a4915d928eb23c090540 Feb 13 07:52:08.909718 ignition[1042]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/opt/cni-plugins-linux-amd64-v1.3.0.tgz" Feb 13 07:52:08.952862 ignition[1042]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/opt/crictl-v1.27.0-linux-amd64.tar.gz" Feb 13 07:52:08.952862 ignition[1042]: INFO : files: createFilesystemsFiles: createFiles: op(5): GET https://github.com/kubernetes-sigs/cri-tools/releases/download/v1.27.0/crictl-v1.27.0-linux-amd64.tar.gz: attempt #1 Feb 13 07:52:09.317446 ignition[1042]: INFO : files: createFilesystemsFiles: createFiles: op(5): GET result: OK Feb 13 07:52:09.372017 ignition[1042]: DEBUG : files: createFilesystemsFiles: createFiles: op(5): file matches expected sum of: 
aa622325bf05520939f9e020d7a28ab48ac23e2fae6f47d5a4e52174c88c1ebc31b464853e4fd65bd8f5331f330a6ca96fd370d247d3eeaed042da4ee2d1219a Feb 13 07:52:09.372017 ignition[1042]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/opt/crictl-v1.27.0-linux-amd64.tar.gz" Feb 13 07:52:09.414750 ignition[1042]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/opt/bin/kubeadm" Feb 13 07:52:09.414750 ignition[1042]: INFO : files: createFilesystemsFiles: createFiles: op(6): GET https://dl.k8s.io/release/v1.27.2/bin/linux/amd64/kubeadm: attempt #1 Feb 13 07:52:09.491408 ignition[1042]: INFO : files: createFilesystemsFiles: createFiles: op(6): GET result: OK Feb 13 07:52:09.642486 ignition[1042]: DEBUG : files: createFilesystemsFiles: createFiles: op(6): file matches expected sum of: f40216b7d14046931c58072d10c7122934eac5a23c08821371f8b08ac1779443ad11d3458a4c5dcde7cf80fc600a9fefb14b1942aa46a52330248d497ca88836 Feb 13 07:52:09.642486 ignition[1042]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/opt/bin/kubeadm" Feb 13 07:52:09.642486 ignition[1042]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/opt/bin/kubelet" Feb 13 07:52:09.699725 ignition[1042]: INFO : files: createFilesystemsFiles: createFiles: op(7): GET https://dl.k8s.io/release/v1.27.2/bin/linux/amd64/kubelet: attempt #1 Feb 13 07:52:09.699725 ignition[1042]: INFO : files: createFilesystemsFiles: createFiles: op(7): GET result: OK Feb 13 07:52:10.014677 ignition[1042]: DEBUG : files: createFilesystemsFiles: createFiles: op(7): file matches expected sum of: a283da2224d456958b2cb99b4f6faf4457c4ed89e9e95f37d970c637f6a7f64ff4dd4d2bfce538759b2d2090933bece599a285ef8fd132eb383fece9a3941560 Feb 13 07:52:10.039926 ignition[1042]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/opt/bin/kubelet" Feb 13 07:52:10.039926 ignition[1042]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/opt/bin/kubectl" Feb 13 07:52:10.039926 ignition[1042]: INFO : files: createFilesystemsFiles: createFiles: op(8): GET https://dl.k8s.io/release/v1.27.2/bin/linux/amd64/kubectl: attempt #1 Feb 13 07:52:10.089791 ignition[1042]: INFO : files: createFilesystemsFiles: createFiles: op(8): GET result: OK Feb 13 07:52:10.228583 ignition[1042]: DEBUG : files: createFilesystemsFiles: createFiles: op(8): file matches expected sum of: 857e67001e74840518413593d90c6e64ad3f00d55fa44ad9a8e2ed6135392c908caff7ec19af18cbe10784b8f83afe687a0bc3bacbc9eee984cdeb9c0749cb83 Feb 13 07:52:10.262794 kernel: BTRFS info: devid 1 device path /dev/sdb6 changed to /dev/disk/by-label/OEM scanned by ignition (1042) Feb 13 07:52:10.262840 ignition[1042]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/opt/bin/kubectl" Feb 13 07:52:10.262840 ignition[1042]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing file "/sysroot/etc/docker/daemon.json" Feb 13 07:52:10.262840 ignition[1042]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing file "/sysroot/etc/docker/daemon.json" Feb 13 07:52:10.262840 ignition[1042]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/home/core/install.sh" Feb 13 07:52:10.262840 ignition[1042]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/home/core/install.sh" 
Feb 13 07:52:10.262840 ignition[1042]: INFO : files: createFilesystemsFiles: createFiles: op(b): [started] writing file "/sysroot/home/core/nginx.yaml" Feb 13 07:52:10.262840 ignition[1042]: INFO : files: createFilesystemsFiles: createFiles: op(b): [finished] writing file "/sysroot/home/core/nginx.yaml" Feb 13 07:52:10.262840 ignition[1042]: INFO : files: createFilesystemsFiles: createFiles: op(c): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Feb 13 07:52:10.262840 ignition[1042]: INFO : files: createFilesystemsFiles: createFiles: op(c): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Feb 13 07:52:10.262840 ignition[1042]: INFO : files: createFilesystemsFiles: createFiles: op(d): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Feb 13 07:52:10.262840 ignition[1042]: INFO : files: createFilesystemsFiles: createFiles: op(d): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Feb 13 07:52:10.262840 ignition[1042]: INFO : files: createFilesystemsFiles: createFiles: op(e): [started] writing file "/sysroot/etc/flatcar/update.conf" Feb 13 07:52:10.262840 ignition[1042]: INFO : files: createFilesystemsFiles: createFiles: op(e): [finished] writing file "/sysroot/etc/flatcar/update.conf" Feb 13 07:52:10.262840 ignition[1042]: INFO : files: createFilesystemsFiles: createFiles: op(f): [started] writing file "/sysroot/etc/systemd/system/packet-phone-home.service" Feb 13 07:52:10.262840 ignition[1042]: INFO : files: createFilesystemsFiles: createFiles: op(f): oem config not found in "/usr/share/oem", looking on oem partition Feb 13 07:52:10.262840 ignition[1042]: INFO : files: createFilesystemsFiles: createFiles: op(f): op(10): [started] mounting "/dev/disk/by-label/OEM" at "/mnt/oem1745539687" Feb 13 07:52:10.577892 kernel: audit: type=1130 audit(1707810730.493:40): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:52:10.493000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:52:10.481625 systemd[1]: Finished ignition-files.service. 
Feb 13 07:52:10.591844 ignition[1042]: CRITICAL : files: createFilesystemsFiles: createFiles: op(f): op(10): [failed] mounting "/dev/disk/by-label/OEM" at "/mnt/oem1745539687": device or resource busy
Feb 13 07:52:10.591844 ignition[1042]: ERROR : files: createFilesystemsFiles: createFiles: op(f): failed to mount ext4 device "/dev/disk/by-label/OEM" at "/mnt/oem1745539687", trying btrfs: device or resource busy
Feb 13 07:52:10.591844 ignition[1042]: INFO : files: createFilesystemsFiles: createFiles: op(f): op(11): [started] mounting "/dev/disk/by-label/OEM" at "/mnt/oem1745539687"
Feb 13 07:52:10.591844 ignition[1042]: INFO : files: createFilesystemsFiles: createFiles: op(f): op(11): [finished] mounting "/dev/disk/by-label/OEM" at "/mnt/oem1745539687"
Feb 13 07:52:10.591844 ignition[1042]: INFO : files: createFilesystemsFiles: createFiles: op(f): op(12): [started] unmounting "/mnt/oem1745539687"
Feb 13 07:52:10.591844 ignition[1042]: INFO : files: createFilesystemsFiles: createFiles: op(f): op(12): [finished] unmounting "/mnt/oem1745539687"
Feb 13 07:52:10.591844 ignition[1042]: INFO : files: createFilesystemsFiles: createFiles: op(f): [finished] writing file "/sysroot/etc/systemd/system/packet-phone-home.service"
Feb 13 07:52:10.591844 ignition[1042]: INFO : files: op(13): [started] processing unit "coreos-metadata-sshkeys@.service"
Feb 13 07:52:10.591844 ignition[1042]: INFO : files: op(13): [finished] processing unit "coreos-metadata-sshkeys@.service"
Feb 13 07:52:10.591844 ignition[1042]: INFO : files: op(14): [started] processing unit "packet-phone-home.service"
Feb 13 07:52:10.591844 ignition[1042]: INFO : files: op(14): [finished] processing unit "packet-phone-home.service"
Feb 13 07:52:10.591844 ignition[1042]: INFO : files: op(15): [started] processing unit "prepare-cni-plugins.service"
Feb 13 07:52:10.591844 ignition[1042]: INFO : files: op(15): op(16): [started] writing unit "prepare-cni-plugins.service" at "/sysroot/etc/systemd/system/prepare-cni-plugins.service"
Feb 13 07:52:10.591844 ignition[1042]: INFO : files: op(15): op(16): [finished] writing unit "prepare-cni-plugins.service" at "/sysroot/etc/systemd/system/prepare-cni-plugins.service"
Feb 13 07:52:10.591844 ignition[1042]: INFO : files: op(15): [finished] processing unit "prepare-cni-plugins.service"
Feb 13 07:52:10.591844 ignition[1042]: INFO : files: op(17): [started] processing unit "prepare-critools.service"
Feb 13 07:52:10.591844 ignition[1042]: INFO : files: op(17): op(18): [started] writing unit "prepare-critools.service" at "/sysroot/etc/systemd/system/prepare-critools.service"
Feb 13 07:52:10.591000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:10.626000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:10.626000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:10.702000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:10.702000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:10.789000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:10.900000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:10.499528 systemd[1]: Starting initrd-setup-root-after-ignition.service...
Feb 13 07:52:11.006973 ignition[1042]: INFO : files: op(17): op(18): [finished] writing unit "prepare-critools.service" at "/sysroot/etc/systemd/system/prepare-critools.service"
Feb 13 07:52:11.006973 ignition[1042]: INFO : files: op(17): [finished] processing unit "prepare-critools.service"
Feb 13 07:52:11.006973 ignition[1042]: INFO : files: op(19): [started] processing unit "prepare-helm.service"
Feb 13 07:52:11.006973 ignition[1042]: INFO : files: op(19): op(1a): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Feb 13 07:52:11.006973 ignition[1042]: INFO : files: op(19): op(1a): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Feb 13 07:52:11.006973 ignition[1042]: INFO : files: op(19): [finished] processing unit "prepare-helm.service"
Feb 13 07:52:11.006973 ignition[1042]: INFO : files: op(1b): [started] setting preset to enabled for "prepare-helm.service"
Feb 13 07:52:11.006973 ignition[1042]: INFO : files: op(1b): [finished] setting preset to enabled for "prepare-helm.service"
Feb 13 07:52:11.006973 ignition[1042]: INFO : files: op(1c): [started] setting preset to enabled for "coreos-metadata-sshkeys@.service "
Feb 13 07:52:11.006973 ignition[1042]: INFO : files: op(1c): [finished] setting preset to enabled for "coreos-metadata-sshkeys@.service "
Feb 13 07:52:11.006973 ignition[1042]: INFO : files: op(1d): [started] setting preset to enabled for "packet-phone-home.service"
Feb 13 07:52:11.006973 ignition[1042]: INFO : files: op(1d): [finished] setting preset to enabled for "packet-phone-home.service"
Feb 13 07:52:11.006973 ignition[1042]: INFO : files: op(1e): [started] setting preset to enabled for "prepare-cni-plugins.service"
Feb 13 07:52:11.006973 ignition[1042]: INFO : files: op(1e): [finished] setting preset to enabled for "prepare-cni-plugins.service"
Feb 13 07:52:11.006973 ignition[1042]: INFO : files: op(1f): [started] setting preset to enabled for "prepare-critools.service"
Feb 13 07:52:11.006973 ignition[1042]: INFO : files: op(1f): [finished] setting preset to enabled for "prepare-critools.service"
Feb 13 07:52:11.006973 ignition[1042]: INFO : files: createResultFile: createFiles: op(20): [started] writing file "/sysroot/etc/.ignition-result.json"
Feb 13 07:52:11.006973 ignition[1042]: INFO : files: createResultFile: createFiles: op(20): [finished] writing file "/sysroot/etc/.ignition-result.json"
Feb 13 07:52:11.006973 ignition[1042]: INFO : files: files passed
Feb 13 07:52:11.006973 ignition[1042]: INFO : POST message to Packet Timeline
Feb 13 07:52:11.006973 ignition[1042]: INFO : GET https://metadata.packet.net/metadata: attempt #1
Feb 13 07:52:11.006973 ignition[1042]: INFO : GET result: OK
Feb 13 07:52:11.006973 ignition[1042]: INFO : Ignition finished successfully
Feb 13 07:52:11.179000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:11.224000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:11.248000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:11.381000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:11.405000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:11.431000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-metadata-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:11.470000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:11.498581 initrd-setup-root-after-ignition[1075]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Feb 13 07:52:11.505000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:10.560884 systemd[1]: torcx-profile-populate.service was skipped because of an unmet condition check (ConditionPathExists=/sysroot/etc/torcx/next-profile).
Feb 13 07:52:11.527000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:10.561190 systemd[1]: Starting ignition-quench.service...
Feb 13 07:52:11.552000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=iscsiuio comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:10.584951 systemd[1]: Finished initrd-setup-root-after-ignition.service.
Feb 13 07:52:10.591982 systemd[1]: ignition-quench.service: Deactivated successfully.
Feb 13 07:52:10.592021 systemd[1]: Finished ignition-quench.service.
Feb 13 07:52:10.626946 systemd[1]: Reached target ignition-complete.target.
Feb 13 07:52:10.657758 systemd[1]: Starting initrd-parse-etc.service...
Feb 13 07:52:11.627000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:11.635789 ignition[1090]: INFO : Ignition 2.14.0
Feb 13 07:52:11.635789 ignition[1090]: INFO : Stage: umount
Feb 13 07:52:11.635789 ignition[1090]: INFO : reading system config file "/usr/lib/ignition/base.d/base.ign"
Feb 13 07:52:11.635789 ignition[1090]: DEBUG : parsing config with SHA512: 0131bd505bfe1b1215ca4ec9809701a3323bf448114294874f7249d8d300440bd742a7532f60673bfa0746c04de0bd5ca68d0fe9a8ecd59464b13a6401323cb4
Feb 13 07:52:11.635789 ignition[1090]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/packet"
Feb 13 07:52:11.635789 ignition[1090]: INFO : umount: umount passed
Feb 13 07:52:11.635789 ignition[1090]: INFO : POST message to Packet Timeline
Feb 13 07:52:11.635789 ignition[1090]: INFO : GET https://metadata.packet.net/metadata: attempt #1
Feb 13 07:52:11.635789 ignition[1090]: INFO : GET result: OK
Feb 13 07:52:11.643000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:11.663000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:11.663000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:11.700000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:11.719000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:11.721000 audit: BPF prog-id=6 op=UNLOAD
Feb 13 07:52:11.751000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:11.770000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:10.679310 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Feb 13 07:52:11.785000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:11.794101 ignition[1090]: INFO : Ignition finished successfully
Feb 13 07:52:11.802000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:10.679367 systemd[1]: Finished initrd-parse-etc.service.
Feb 13 07:52:10.703059 systemd[1]: Reached target initrd-fs.target.
Feb 13 07:52:11.832000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:10.718052 systemd[1]: Reached target initrd.target.
Feb 13 07:52:11.847000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:10.747189 systemd[1]: dracut-mount.service was skipped because no trigger condition checks were met.
Feb 13 07:52:11.941879 kernel: kauditd_printk_skb: 29 callbacks suppressed
Feb 13 07:52:11.941896 kernel: audit: type=1131 audit(1707810731.847:70): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:10.749462 systemd[1]: Starting dracut-pre-pivot.service...
Feb 13 07:52:11.950000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:10.764050 systemd[1]: Finished dracut-pre-pivot.service.
Feb 13 07:52:12.023846 kernel: audit: type=1131 audit(1707810731.950:71): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:10.790654 systemd[1]: Starting initrd-cleanup.service...
Feb 13 07:52:12.023000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:10.820153 systemd[1]: Stopped target nss-lookup.target.
Feb 13 07:52:12.101850 kernel: audit: type=1131 audit(1707810732.023:72): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:10.830248 systemd[1]: Stopped target remote-cryptsetup.target.
Feb 13 07:52:10.852372 systemd[1]: Stopped target timers.target.
Feb 13 07:52:10.873314 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Feb 13 07:52:12.131000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:10.873691 systemd[1]: Stopped dracut-pre-pivot.service.
Feb 13 07:52:12.219587 kernel: audit: type=1131 audit(1707810732.131:73): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:12.219600 kernel: audit: type=1131 audit(1707810732.199:74): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:12.199000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:10.901606 systemd[1]: Stopped target initrd.target.
Feb 13 07:52:12.319807 kernel: audit: type=1131 audit(1707810732.258:75): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:12.258000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:10.928335 systemd[1]: Stopped target basic.target.
Feb 13 07:52:10.949318 systemd[1]: Stopped target ignition-complete.target.
Feb 13 07:52:12.401715 kernel: audit: type=1131 audit(1707810732.341:76): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:12.341000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:10.971332 systemd[1]: Stopped target ignition-diskful.target.
Feb 13 07:52:12.409000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:10.997303 systemd[1]: Stopped target initrd-root-device.target.
Feb 13 07:52:12.539819 kernel: audit: type=1130 audit(1707810732.409:77): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:12.539865 kernel: audit: type=1131 audit(1707810732.409:78): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:12.409000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:11.017336 systemd[1]: Stopped target remote-fs.target.
Feb 13 07:52:11.044297 systemd[1]: Stopped target remote-fs-pre.target.
Feb 13 07:52:11.065360 systemd[1]: Stopped target sysinit.target.
Feb 13 07:52:11.086321 systemd[1]: Stopped target local-fs.target.
Feb 13 07:52:11.112329 systemd[1]: Stopped target local-fs-pre.target.
Feb 13 07:52:11.138321 systemd[1]: Stopped target swap.target.
Feb 13 07:52:11.158220 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Feb 13 07:52:11.158574 systemd[1]: Stopped dracut-pre-mount.service.
Feb 13 07:52:11.180524 systemd[1]: Stopped target cryptsetup.target.
Feb 13 07:52:11.202207 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Feb 13 07:52:11.202572 systemd[1]: Stopped dracut-initqueue.service.
Feb 13 07:52:11.225488 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Feb 13 07:52:11.225857 systemd[1]: Stopped ignition-fetch-offline.service.
Feb 13 07:52:11.249513 systemd[1]: Stopped target paths.target.
Feb 13 07:52:11.270192 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Feb 13 07:52:11.271851 systemd[1]: Stopped systemd-ask-password-console.path.
Feb 13 07:52:11.293334 systemd[1]: Stopped target slices.target.
Feb 13 07:52:11.315298 systemd[1]: Stopped target sockets.target.
Feb 13 07:52:11.337284 systemd[1]: iscsid.socket: Deactivated successfully.
Feb 13 07:52:11.337526 systemd[1]: Closed iscsid.socket.
Feb 13 07:52:11.358387 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Feb 13 07:52:11.358770 systemd[1]: Stopped initrd-setup-root-after-ignition.service.
Feb 13 07:52:12.713000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=network-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:11.382402 systemd[1]: ignition-files.service: Deactivated successfully.
Feb 13 07:52:12.802847 kernel: audit: type=1131 audit(1707810732.713:79): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=network-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:11.382769 systemd[1]: Stopped ignition-files.service.
Feb 13 07:52:11.406425 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully.
Feb 13 07:52:11.406802 systemd[1]: Stopped flatcar-metadata-hostname.service.
Feb 13 07:52:12.852125 iscsid[909]: iscsid shutting down.
Feb 13 07:52:11.434649 systemd[1]: Stopping ignition-mount.service...
Feb 13 07:52:11.445895 systemd[1]: Stopping iscsiuio.service...
Feb 13 07:52:11.460803 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Feb 13 07:52:11.460982 systemd[1]: Stopped kmod-static-nodes.service.
Feb 13 07:52:11.471816 systemd[1]: Stopping sysroot-boot.service...
Feb 13 07:52:11.484134 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Feb 13 07:52:11.484501 systemd[1]: Stopped systemd-udev-trigger.service.
Feb 13 07:52:11.506302 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Feb 13 07:52:11.506628 systemd[1]: Stopped dracut-pre-trigger.service.
Feb 13 07:52:11.536493 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Feb 13 07:52:11.538275 systemd[1]: iscsiuio.service: Deactivated successfully.
Feb 13 07:52:11.538512 systemd[1]: Stopped iscsiuio.service.
Feb 13 07:52:11.554424 systemd[1]: Stopped target network.target.
Feb 13 07:52:11.567973 systemd[1]: iscsiuio.socket: Deactivated successfully.
Feb 13 07:52:11.568073 systemd[1]: Closed iscsiuio.socket.
Feb 13 07:52:11.583241 systemd[1]: Stopping systemd-networkd.service...
Feb 13 07:52:11.594770 systemd-networkd[880]: enp1s0f0np0: DHCPv6 lease lost
Feb 13 07:52:11.598164 systemd[1]: Stopping systemd-resolved.service...
Feb 13 07:52:11.602785 systemd-networkd[880]: enp1s0f1np1: DHCPv6 lease lost
Feb 13 07:52:11.613282 systemd[1]: systemd-resolved.service: Deactivated successfully.
Feb 13 07:52:12.851000 audit: BPF prog-id=9 op=UNLOAD
Feb 13 07:52:12.852644 systemd-journald[267]: Received SIGTERM from PID 1 (n/a).
Feb 13 07:52:11.613529 systemd[1]: Stopped systemd-resolved.service.
Feb 13 07:52:11.628520 systemd[1]: systemd-networkd.service: Deactivated successfully.
Feb 13 07:52:11.628564 systemd[1]: Stopped systemd-networkd.service.
Feb 13 07:52:11.644054 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Feb 13 07:52:11.644120 systemd[1]: Finished initrd-cleanup.service.
Feb 13 07:52:11.663933 systemd[1]: sysroot-boot.service: Deactivated successfully.
Feb 13 07:52:11.663981 systemd[1]: Stopped sysroot-boot.service.
Feb 13 07:52:11.701254 systemd[1]: ignition-mount.service: Deactivated successfully.
Feb 13 07:52:11.701379 systemd[1]: Stopped ignition-mount.service.
Feb 13 07:52:11.722161 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Feb 13 07:52:11.722250 systemd[1]: Closed systemd-networkd.socket.
Feb 13 07:52:11.736081 systemd[1]: ignition-disks.service: Deactivated successfully.
Feb 13 07:52:11.736229 systemd[1]: Stopped ignition-disks.service.
Feb 13 07:52:11.752104 systemd[1]: ignition-kargs.service: Deactivated successfully.
Feb 13 07:52:11.752246 systemd[1]: Stopped ignition-kargs.service.
Feb 13 07:52:11.771161 systemd[1]: ignition-setup.service: Deactivated successfully.
Feb 13 07:52:11.771309 systemd[1]: Stopped ignition-setup.service.
Feb 13 07:52:11.786139 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Feb 13 07:52:11.786284 systemd[1]: Stopped initrd-setup-root.service.
Feb 13 07:52:11.804955 systemd[1]: Stopping network-cleanup.service...
Feb 13 07:52:11.817855 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Feb 13 07:52:11.818094 systemd[1]: Stopped parse-ip-for-networkd.service.
Feb 13 07:52:11.833197 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Feb 13 07:52:11.833350 systemd[1]: Stopped systemd-sysctl.service.
Feb 13 07:52:11.930658 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Feb 13 07:52:11.930697 systemd[1]: Stopped systemd-modules-load.service.
Feb 13 07:52:11.950937 systemd[1]: Stopping systemd-udevd.service...
Feb 13 07:52:12.017527 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Feb 13 07:52:12.017815 systemd[1]: systemd-udevd.service: Deactivated successfully.
Feb 13 07:52:12.017873 systemd[1]: Stopped systemd-udevd.service.
Feb 13 07:52:12.024290 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Feb 13 07:52:12.024313 systemd[1]: Closed systemd-udevd-control.socket.
Feb 13 07:52:12.093883 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Feb 13 07:52:12.093902 systemd[1]: Closed systemd-udevd-kernel.socket.
Feb 13 07:52:12.110722 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Feb 13 07:52:12.110744 systemd[1]: Stopped dracut-pre-udev.service.
Feb 13 07:52:12.131737 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Feb 13 07:52:12.151753 systemd[1]: Stopped dracut-cmdline.service.
Feb 13 07:52:12.199827 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Feb 13 07:52:12.199851 systemd[1]: Stopped dracut-cmdline-ask.service.
Feb 13 07:52:12.259298 systemd[1]: Starting initrd-udevadm-cleanup-db.service...
Feb 13 07:52:12.327814 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Feb 13 07:52:12.327844 systemd[1]: Stopped systemd-vconsole-setup.service.
Feb 13 07:52:12.342019 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Feb 13 07:52:12.342072 systemd[1]: Finished initrd-udevadm-cleanup-db.service.
Feb 13 07:52:12.696209 systemd[1]: network-cleanup.service: Deactivated successfully.
Feb 13 07:52:12.696427 systemd[1]: Stopped network-cleanup.service.
Feb 13 07:52:12.714416 systemd[1]: Reached target initrd-switch-root.target.
Feb 13 07:52:12.795272 systemd[1]: Starting initrd-switch-root.service...
Feb 13 07:52:12.807265 systemd[1]: Switching root.
Feb 13 07:52:12.853731 systemd-journald[267]: Journal stopped
Feb 13 07:52:16.513065 kernel: SELinux: Class mctp_socket not defined in policy.
Feb 13 07:52:16.513078 kernel: SELinux: Class anon_inode not defined in policy.
Feb 13 07:52:16.513100 kernel: SELinux: the above unknown classes and permissions will be allowed
Feb 13 07:52:16.513119 kernel: SELinux: policy capability network_peer_controls=1
Feb 13 07:52:16.513124 kernel: SELinux: policy capability open_perms=1
Feb 13 07:52:16.513129 kernel: SELinux: policy capability extended_socket_class=1
Feb 13 07:52:16.513149 kernel: SELinux: policy capability always_check_network=0
Feb 13 07:52:16.513155 kernel: SELinux: policy capability cgroup_seclabel=1
Feb 13 07:52:16.513160 kernel: SELinux: policy capability nnp_nosuid_transition=1
Feb 13 07:52:16.513166 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Feb 13 07:52:16.513171 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Feb 13 07:52:16.513176 systemd[1]: Successfully loaded SELinux policy in 311.803ms.
Feb 13 07:52:16.513183 systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 19.719ms.
Feb 13 07:52:16.513203 systemd[1]: systemd 252 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL -ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE -TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Feb 13 07:52:16.513211 systemd[1]: Detected architecture x86-64.
Feb 13 07:52:16.513217 systemd[1]: Detected first boot.
Feb 13 07:52:16.513222 systemd[1]: Hostname set to .
Feb 13 07:52:16.513229 systemd[1]: Initializing machine ID from random generator.
Feb 13 07:52:16.513234 kernel: SELinux: Context system_u:object_r:container_file_t:s0:c1022,c1023 is not valid (left unmapped).
Feb 13 07:52:16.513240 systemd[1]: Populated /etc with preset unit settings.
Feb 13 07:52:16.513246 systemd[1]: /usr/lib/systemd/system/locksmithd.service:8: Unit uses CPUShares=; please use CPUWeight= instead. Support for CPUShares= will be removed soon.
Feb 13 07:52:16.513253 systemd[1]: /usr/lib/systemd/system/locksmithd.service:9: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 13 07:52:16.513260 systemd[1]: /run/systemd/system/docker.socket:8: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Feb 13 07:52:16.513266 systemd[1]: iscsid.service: Deactivated successfully.
Feb 13 07:52:16.513288 systemd[1]: Stopped iscsid.service.
Feb 13 07:52:16.513293 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Feb 13 07:52:16.513300 systemd[1]: Stopped initrd-switch-root.service.
Feb 13 07:52:16.513307 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Feb 13 07:52:16.513313 systemd[1]: Created slice system-addon\x2dconfig.slice.
Feb 13 07:52:16.513319 systemd[1]: Created slice system-addon\x2drun.slice.
Feb 13 07:52:16.513325 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice.
Feb 13 07:52:16.513331 systemd[1]: Created slice system-getty.slice.
Feb 13 07:52:16.513336 systemd[1]: Created slice system-modprobe.slice.
Feb 13 07:52:16.513342 systemd[1]: Created slice system-serial\x2dgetty.slice.
Feb 13 07:52:16.513348 systemd[1]: Created slice system-system\x2dcloudinit.slice.
Feb 13 07:52:16.513354 systemd[1]: Created slice system-systemd\x2dfsck.slice.
Feb 13 07:52:16.513360 systemd[1]: Created slice user.slice.
Feb 13 07:52:16.513366 systemd[1]: Started systemd-ask-password-console.path.
Feb 13 07:52:16.513372 systemd[1]: Started systemd-ask-password-wall.path.
Feb 13 07:52:16.513378 systemd[1]: Set up automount boot.automount.
Feb 13 07:52:16.513386 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount.
Feb 13 07:52:16.513392 systemd[1]: Stopped target initrd-switch-root.target.
Feb 13 07:52:16.513398 systemd[1]: Stopped target initrd-fs.target.
Feb 13 07:52:16.513404 systemd[1]: Stopped target initrd-root-fs.target.
Feb 13 07:52:16.513411 systemd[1]: Reached target integritysetup.target.
Feb 13 07:52:16.513417 systemd[1]: Reached target remote-cryptsetup.target.
Feb 13 07:52:16.513423 systemd[1]: Reached target remote-fs.target.
Feb 13 07:52:16.513429 systemd[1]: Reached target slices.target.
Feb 13 07:52:16.513435 systemd[1]: Reached target swap.target.
Feb 13 07:52:16.513442 systemd[1]: Reached target torcx.target.
Feb 13 07:52:16.513448 systemd[1]: Reached target veritysetup.target.
Feb 13 07:52:16.513454 systemd[1]: Listening on systemd-coredump.socket.
Feb 13 07:52:16.513461 systemd[1]: Listening on systemd-initctl.socket.
Feb 13 07:52:16.513468 systemd[1]: Listening on systemd-networkd.socket.
Feb 13 07:52:16.513474 systemd[1]: Listening on systemd-udevd-control.socket.
Feb 13 07:52:16.513480 systemd[1]: Listening on systemd-udevd-kernel.socket.
Feb 13 07:52:16.513486 systemd[1]: Listening on systemd-userdbd.socket.
Feb 13 07:52:16.513492 systemd[1]: Mounting dev-hugepages.mount...
Feb 13 07:52:16.513499 systemd[1]: Mounting dev-mqueue.mount...
Feb 13 07:52:16.513506 systemd[1]: Mounting media.mount...
Feb 13 07:52:16.513512 systemd[1]: proc-xen.mount was skipped because of an unmet condition check (ConditionVirtualization=xen).
Feb 13 07:52:16.513518 systemd[1]: Mounting sys-kernel-debug.mount...
Feb 13 07:52:16.513524 systemd[1]: Mounting sys-kernel-tracing.mount...
Feb 13 07:52:16.513531 systemd[1]: Mounting tmp.mount...
Feb 13 07:52:16.513537 systemd[1]: Starting flatcar-tmpfiles.service...
Feb 13 07:52:16.513543 systemd[1]: ignition-delete-config.service was skipped because no trigger condition checks were met.
Feb 13 07:52:16.513549 systemd[1]: Starting kmod-static-nodes.service...
Feb 13 07:52:16.513556 systemd[1]: Starting modprobe@configfs.service...
Feb 13 07:52:16.513563 systemd[1]: Starting modprobe@dm_mod.service...
Feb 13 07:52:16.513569 systemd[1]: Starting modprobe@drm.service...
Feb 13 07:52:16.513575 systemd[1]: Starting modprobe@efi_pstore.service...
Feb 13 07:52:16.513582 systemd[1]: Starting modprobe@fuse.service...
Feb 13 07:52:16.513588 kernel: fuse: init (API version 7.34)
Feb 13 07:52:16.513594 systemd[1]: Starting modprobe@loop.service...
Feb 13 07:52:16.513600 kernel: loop: module loaded
Feb 13 07:52:16.513607 systemd[1]: setup-nsswitch.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Feb 13 07:52:16.513613 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Feb 13 07:52:16.513619 systemd[1]: Stopped systemd-fsck-root.service.
Feb 13 07:52:16.513626 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Feb 13 07:52:16.513635 systemd[1]: Stopped systemd-fsck-usr.service.
Feb 13 07:52:16.513642 systemd[1]: Stopped systemd-journald.service.
Feb 13 07:52:16.513670 systemd[1]: Starting systemd-journald.service...
Feb 13 07:52:16.513676 systemd[1]: Starting systemd-modules-load.service...
Feb 13 07:52:16.513685 systemd-journald[1242]: Journal started
Feb 13 07:52:16.513736 systemd-journald[1242]: Runtime Journal (/run/log/journal/bb662360274843c9b210989846e53039) is 8.0M, max 640.1M, 632.1M free.
Feb 13 07:52:13.214000 audit: MAC_POLICY_LOAD auid=4294967295 ses=4294967295 lsm=selinux res=1
Feb 13 07:52:13.519000 audit[1]: AVC avc: denied { integrity } for pid=1 comm="systemd" lockdown_reason="/dev/mem,kmem,port" scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=lockdown permissive=1
Feb 13 07:52:13.521000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=1
Feb 13 07:52:13.521000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=1
Feb 13 07:52:13.522000 audit: BPF prog-id=10 op=LOAD
Feb 13 07:52:13.522000 audit: BPF prog-id=10 op=UNLOAD
Feb 13 07:52:13.522000 audit: BPF prog-id=11 op=LOAD
Feb 13 07:52:13.522000 audit: BPF prog-id=11 op=UNLOAD
Feb 13 07:52:13.587000 audit[1130]: AVC avc: denied { associate } for pid=1130 comm="torcx-generator" name="docker" dev="tmpfs" ino=2 scontext=system_u:object_r:unlabeled_t:s0 tcontext=system_u:object_r:tmpfs_t:s0 tclass=filesystem permissive=1 srawcon="system_u:object_r:container_file_t:s0:c1022,c1023"
Feb 13 07:52:13.587000 audit[1130]: SYSCALL arch=c000003e syscall=188 success=yes exit=0 a0=c0001278dc a1=c00002ce58 a2=c00002bb00 a3=32 items=0 ppid=1113 pid=1130 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="torcx-generator" exe="/usr/lib/systemd/system-generators/torcx-generator" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 07:52:13.587000 audit: PROCTITLE proctitle=2F7573722F6C69622F73797374656D642F73797374656D2D67656E657261746F72732F746F7263782D67656E657261746F72002F72756E2F73797374656D642F67656E657261746F72002F72756E2F73797374656D642F67656E657261746F722E6561726C79002F72756E2F73797374656D642F67656E657261746F722E6C61
Feb 13 07:52:13.614000 audit[1130]: AVC avc: denied { associate } for pid=1130 comm="torcx-generator" name="lib" scontext=system_u:object_r:unlabeled_t:s0 tcontext=system_u:object_r:tmpfs_t:s0 tclass=filesystem permissive=1
Feb 13 07:52:13.614000 audit[1130]: SYSCALL arch=c000003e syscall=258 success=yes exit=0 a0=ffffffffffffff9c a1=c0001279b5 a2=1ed a3=0 items=2 ppid=1113 pid=1130 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="torcx-generator" exe="/usr/lib/systemd/system-generators/torcx-generator" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 07:52:13.614000 audit: CWD cwd="/"
Feb 13 07:52:13.614000 audit: PATH item=0 name=(null) inode=2 dev=00:1b mode=040755 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:unlabeled_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0
Feb 13 07:52:13.614000 audit: PATH item=1 name=(null) inode=3 dev=00:1b mode=040755 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:unlabeled_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0
Feb 13 07:52:13.614000 audit: PROCTITLE proctitle=2F7573722F6C69622F73797374656D642F73797374656D2D67656E657261746F72732F746F7263782D67656E657261746F72002F72756E2F73797374656D642F67656E657261746F72002F72756E2F73797374656D642F67656E657261746F722E6561726C79002F72756E2F73797374656D642F67656E657261746F722E6C61
Feb 13 07:52:15.129000 audit: BPF prog-id=12 op=LOAD
Feb 13 07:52:15.129000 audit: BPF prog-id=3 op=UNLOAD
Feb 13 07:52:15.129000 audit: BPF prog-id=13 op=LOAD
Feb 13 07:52:15.130000 audit: BPF prog-id=14 op=LOAD
Feb 13 07:52:15.130000 audit: BPF prog-id=4 op=UNLOAD
Feb 13 07:52:15.130000 audit: BPF prog-id=5 op=UNLOAD
Feb 13 07:52:15.130000 audit: BPF prog-id=15 op=LOAD
Feb 13 07:52:15.130000 audit: BPF prog-id=12 op=UNLOAD
Feb 13 07:52:15.130000 audit: BPF prog-id=16 op=LOAD
Feb 13 07:52:15.130000 audit: BPF prog-id=17 op=LOAD
Feb 13 07:52:15.130000 audit: BPF prog-id=13 op=UNLOAD
Feb 13 07:52:15.130000 audit: BPF prog-id=14 op=UNLOAD
Feb 13 07:52:15.131000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:15.177000 audit: BPF prog-id=15 op=UNLOAD
Feb 13 07:52:15.182000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=iscsid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:15.235000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=initrd-switch-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:15.235000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=initrd-switch-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:16.428000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:16.464000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck-usr comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:16.485000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:16.485000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:16.485000 audit: BPF prog-id=18 op=LOAD
Feb 13 07:52:16.486000 audit: BPF prog-id=19 op=LOAD
Feb 13 07:52:16.486000 audit: BPF prog-id=20 op=LOAD
Feb 13 07:52:16.486000 audit: BPF prog-id=16 op=UNLOAD
Feb 13 07:52:16.486000 audit: BPF prog-id=17 op=UNLOAD
Feb 13 07:52:16.510000 audit: CONFIG_CHANGE op=set audit_enabled=1 old=1 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 res=1
Feb 13 07:52:16.510000 audit[1242]: SYSCALL arch=c000003e syscall=46 success=yes exit=60 a0=3 a1=7fffa9ea6e50 a2=4000 a3=7fffa9ea6eec items=0 ppid=1 pid=1242 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="systemd-journal" exe="/usr/lib/systemd/systemd-journald" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 07:52:16.510000 audit: PROCTITLE proctitle="/usr/lib/systemd/systemd-journald"
Feb 13 07:52:13.585407 /usr/lib/systemd/system-generators/torcx-generator[1130]: time="2024-02-13T07:52:13Z" level=debug msg="common configuration parsed" base_dir=/var/lib/torcx/ conf_dir=/etc/torcx/ run_dir=/run/torcx/ store_paths="[/usr/share/torcx/store /usr/share/oem/torcx/store/3510.3.2 /usr/share/oem/torcx/store /var/lib/torcx/store/3510.3.2 /var/lib/torcx/store]"
Feb 13 07:52:15.128299 systemd[1]: Queued start job for default target multi-user.target.
Feb 13 07:52:13.585846 /usr/lib/systemd/system-generators/torcx-generator[1130]: time="2024-02-13T07:52:13Z" level=debug msg="profile found" name=docker-1.12-no path=/usr/share/torcx/profiles/docker-1.12-no.json
Feb 13 07:52:15.131743 systemd[1]: systemd-journald.service: Deactivated successfully.
Feb 13 07:52:13.585858 /usr/lib/systemd/system-generators/torcx-generator[1130]: time="2024-02-13T07:52:13Z" level=debug msg="profile found" name=vendor path=/usr/share/torcx/profiles/vendor.json
Feb 13 07:52:13.585877 /usr/lib/systemd/system-generators/torcx-generator[1130]: time="2024-02-13T07:52:13Z" level=info msg="no vendor profile selected by /etc/flatcar/docker-1.12"
Feb 13 07:52:13.585883 /usr/lib/systemd/system-generators/torcx-generator[1130]: time="2024-02-13T07:52:13Z" level=debug msg="skipped missing lower profile" missing profile=oem
Feb 13 07:52:13.585899 /usr/lib/systemd/system-generators/torcx-generator[1130]: time="2024-02-13T07:52:13Z" level=warning msg="no next profile: unable to read profile file: open /etc/torcx/next-profile: no such file or directory"
Feb 13 07:52:13.585906 /usr/lib/systemd/system-generators/torcx-generator[1130]: time="2024-02-13T07:52:13Z" level=debug msg="apply configuration parsed" lower profiles (vendor/oem)="[vendor]" upper profile (user)=
Feb 13 07:52:13.586019 /usr/lib/systemd/system-generators/torcx-generator[1130]: time="2024-02-13T07:52:13Z" level=debug msg="mounted tmpfs" target=/run/torcx/unpack
Feb 13 07:52:13.586039 /usr/lib/systemd/system-generators/torcx-generator[1130]: time="2024-02-13T07:52:13Z" level=debug msg="profile found" name=docker-1.12-no path=/usr/share/torcx/profiles/docker-1.12-no.json
Feb 13 07:52:13.586046 /usr/lib/systemd/system-generators/torcx-generator[1130]: time="2024-02-13T07:52:13Z" level=debug msg="profile found" name=vendor path=/usr/share/torcx/profiles/vendor.json
Feb 13 07:52:13.586464 /usr/lib/systemd/system-generators/torcx-generator[1130]: time="2024-02-13T07:52:13Z" level=debug msg="new archive/reference added to cache" format=tgz name=docker path="/usr/share/torcx/store/docker:20.10.torcx.tgz" reference=20.10
Feb 13 07:52:13.586483 /usr/lib/systemd/system-generators/torcx-generator[1130]: time="2024-02-13T07:52:13Z" level=debug msg="new archive/reference added to cache" format=tgz name=docker path="/usr/share/torcx/store/docker:com.coreos.cl.torcx.tgz" reference=com.coreos.cl
Feb 13 07:52:13.586493 /usr/lib/systemd/system-generators/torcx-generator[1130]: time="2024-02-13T07:52:13Z" level=info msg="store skipped" err="open /usr/share/oem/torcx/store/3510.3.2: no such file or directory" path=/usr/share/oem/torcx/store/3510.3.2
Feb 13 07:52:13.586500 /usr/lib/systemd/system-generators/torcx-generator[1130]: time="2024-02-13T07:52:13Z" level=info msg="store skipped" err="open /usr/share/oem/torcx/store: no such file or directory" path=/usr/share/oem/torcx/store
Feb 13 07:52:13.586509 /usr/lib/systemd/system-generators/torcx-generator[1130]: time="2024-02-13T07:52:13Z" level=info msg="store skipped" err="open /var/lib/torcx/store/3510.3.2: no such file or directory" path=/var/lib/torcx/store/3510.3.2
Feb 13 07:52:13.586517 /usr/lib/systemd/system-generators/torcx-generator[1130]: time="2024-02-13T07:52:13Z" level=info msg="store skipped" err="open /var/lib/torcx/store: no such file or directory" path=/var/lib/torcx/store
Feb 13 07:52:14.776547 /usr/lib/systemd/system-generators/torcx-generator[1130]: time="2024-02-13T07:52:14Z" level=debug msg="image unpacked" image=docker path=/run/torcx/unpack/docker reference=com.coreos.cl
Feb 13 07:52:14.776694 /usr/lib/systemd/system-generators/torcx-generator[1130]: time="2024-02-13T07:52:14Z" level=debug msg="binaries propagated" assets="[/bin/containerd /bin/containerd-shim /bin/ctr /bin/docker /bin/docker-containerd /bin/docker-containerd-shim /bin/docker-init /bin/docker-proxy /bin/docker-runc /bin/dockerd /bin/runc /bin/tini]" image=docker path=/run/torcx/unpack/docker reference=com.coreos.cl
Feb 13 07:52:14.776751 /usr/lib/systemd/system-generators/torcx-generator[1130]: time="2024-02-13T07:52:14Z" level=debug msg="networkd units propagated" assets="[/lib/systemd/network/50-docker.network /lib/systemd/network/90-docker-veth.network]" image=docker path=/run/torcx/unpack/docker reference=com.coreos.cl
Feb 13 07:52:14.776844 /usr/lib/systemd/system-generators/torcx-generator[1130]: time="2024-02-13T07:52:14Z" level=debug msg="systemd units propagated" assets="[/lib/systemd/system/containerd.service /lib/systemd/system/docker.service /lib/systemd/system/docker.socket /lib/systemd/system/sockets.target.wants /lib/systemd/system/multi-user.target.wants]" image=docker path=/run/torcx/unpack/docker reference=com.coreos.cl
Feb 13 07:52:14.776873 /usr/lib/systemd/system-generators/torcx-generator[1130]: time="2024-02-13T07:52:14Z" level=debug msg="profile applied" sealed profile=/run/torcx/profile.json upper profile=
Feb 13 07:52:14.776906 /usr/lib/systemd/system-generators/torcx-generator[1130]: time="2024-02-13T07:52:14Z" level=debug msg="system state sealed" content="[TORCX_LOWER_PROFILES=\"vendor\" TORCX_UPPER_PROFILE=\"\" TORCX_PROFILE_PATH=\"/run/torcx/profile.json\" TORCX_BINDIR=\"/run/torcx/bin\" TORCX_UNPACKDIR=\"/run/torcx/unpack\"]" path=/run/metadata/torcx
Feb 13 07:52:16.544811 systemd[1]: Starting systemd-network-generator.service...
Feb 13 07:52:16.567814 systemd[1]: Starting systemd-remount-fs.service...
Feb 13 07:52:16.588687 systemd[1]: Starting systemd-udev-trigger.service...
Feb 13 07:52:16.621135 systemd[1]: verity-setup.service: Deactivated successfully.
Feb 13 07:52:16.621156 systemd[1]: Stopped verity-setup.service.
Feb 13 07:52:16.627000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=verity-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:16.655682 systemd[1]: xenserver-pv-version.service was skipped because of an unmet condition check (ConditionVirtualization=xen).
Feb 13 07:52:16.669681 systemd[1]: Started systemd-journald.service.
Feb 13 07:52:16.677000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:16.678160 systemd[1]: Mounted dev-hugepages.mount.
Feb 13 07:52:16.684900 systemd[1]: Mounted dev-mqueue.mount.
Feb 13 07:52:16.691877 systemd[1]: Mounted media.mount.
Feb 13 07:52:16.698883 systemd[1]: Mounted sys-kernel-debug.mount.
Feb 13 07:52:16.707865 systemd[1]: Mounted sys-kernel-tracing.mount.
Feb 13 07:52:16.716854 systemd[1]: Mounted tmp.mount.
Feb 13 07:52:16.723914 systemd[1]: Finished flatcar-tmpfiles.service.
Feb 13 07:52:16.731000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=flatcar-tmpfiles comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:16.731968 systemd[1]: Finished kmod-static-nodes.service.
Feb 13 07:52:16.739000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:16.739936 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Feb 13 07:52:16.740028 systemd[1]: Finished modprobe@configfs.service.
Feb 13 07:52:16.748000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:16.748000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:16.749017 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Feb 13 07:52:16.749129 systemd[1]: Finished modprobe@dm_mod.service.
Feb 13 07:52:16.757000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:16.757000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:16.758096 systemd[1]: modprobe@drm.service: Deactivated successfully.
Feb 13 07:52:16.758242 systemd[1]: Finished modprobe@drm.service.
Feb 13 07:52:16.767000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:16.767000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:16.768304 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Feb 13 07:52:16.768537 systemd[1]: Finished modprobe@efi_pstore.service.
Feb 13 07:52:16.776000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:16.776000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:16.777435 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Feb 13 07:52:16.777746 systemd[1]: Finished modprobe@fuse.service.
Feb 13 07:52:16.785000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:16.785000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:16.786417 systemd[1]: modprobe@loop.service: Deactivated successfully.
Feb 13 07:52:16.786723 systemd[1]: Finished modprobe@loop.service.
Feb 13 07:52:16.794000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:16.794000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:16.795449 systemd[1]: Finished systemd-modules-load.service.
Feb 13 07:52:16.803000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:16.804378 systemd[1]: Finished systemd-network-generator.service.
Feb 13 07:52:16.812000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:16.813403 systemd[1]: Finished systemd-remount-fs.service.
Feb 13 07:52:16.821000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-remount-fs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:16.822479 systemd[1]: Finished systemd-udev-trigger.service.
Feb 13 07:52:16.830000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:16.832010 systemd[1]: Reached target network-pre.target.
Feb 13 07:52:16.843551 systemd[1]: Mounting sys-fs-fuse-connections.mount...
Feb 13 07:52:16.852307 systemd[1]: Mounting sys-kernel-config.mount...
Feb 13 07:52:16.859809 systemd[1]: remount-root.service was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Feb 13 07:52:16.860836 systemd[1]: Starting systemd-hwdb-update.service...
Feb 13 07:52:16.868288 systemd[1]: Starting systemd-journal-flush.service...
Feb 13 07:52:16.872123 systemd-journald[1242]: Time spent on flushing to /var/log/journal/bb662360274843c9b210989846e53039 is 14.856ms for 1598 entries.
Feb 13 07:52:16.872123 systemd-journald[1242]: System Journal (/var/log/journal/bb662360274843c9b210989846e53039) is 8.0M, max 195.6M, 187.6M free.
Feb 13 07:52:16.906703 systemd-journald[1242]: Received client request to flush runtime journal.
Feb 13 07:52:16.884750 systemd[1]: systemd-pstore.service was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Feb 13 07:52:16.885329 systemd[1]: Starting systemd-random-seed.service...
Feb 13 07:52:16.895739 systemd[1]: systemd-repart.service was skipped because no trigger condition checks were met.
Feb 13 07:52:16.896258 systemd[1]: Starting systemd-sysctl.service...
Feb 13 07:52:16.903193 systemd[1]: Starting systemd-sysusers.service...
Feb 13 07:52:16.910228 systemd[1]: Starting systemd-udev-settle.service...
Feb 13 07:52:16.918780 systemd[1]: Mounted sys-fs-fuse-connections.mount.
Feb 13 07:52:16.926809 systemd[1]: Mounted sys-kernel-config.mount.
Feb 13 07:52:16.934861 systemd[1]: Finished systemd-journal-flush.service.
Feb 13 07:52:16.942000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-flush comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:16.942862 systemd[1]: Finished systemd-random-seed.service.
Feb 13 07:52:16.956577 kernel: kauditd_printk_skb: 67 callbacks suppressed
Feb 13 07:52:16.956602 kernel: audit: type=1130 audit(1707810736.942:138): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-flush comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:17.005000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-random-seed comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:17.005861 systemd[1]: Finished systemd-sysctl.service.
Feb 13 07:52:17.048690 kernel: audit: type=1130 audit(1707810737.005:139): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-random-seed comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:17.055000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:17.055870 systemd[1]: Finished systemd-sysusers.service.
Feb 13 07:52:17.099694 kernel: audit: type=1130 audit(1707810737.055:140): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:17.106000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysusers comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:17.107829 systemd[1]: Reached target first-boot-complete.target.
Feb 13 07:52:17.151672 kernel: audit: type=1130 audit(1707810737.106:141): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysusers comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:17.160439 udevadm[1258]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation-early.service, lvm2-activation.service not to pull it in.
Feb 13 07:52:17.175290 systemd[1]: Finished systemd-hwdb-update.service.
Feb 13 07:52:17.184000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-hwdb-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:17.230000 audit: BPF prog-id=21 op=LOAD
Feb 13 07:52:17.231651 kernel: audit: type=1130 audit(1707810737.184:142): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-hwdb-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:17.231684 kernel: audit: type=1334 audit(1707810737.230:143): prog-id=21 op=LOAD
Feb 13 07:52:17.250000 audit: BPF prog-id=22 op=LOAD
Feb 13 07:52:17.251609 systemd[1]: Starting systemd-udevd.service...
Feb 13 07:52:17.271011 kernel: audit: type=1334 audit(1707810737.250:144): prog-id=22 op=LOAD
Feb 13 07:52:17.271033 kernel: audit: type=1334 audit(1707810737.250:145): prog-id=7 op=UNLOAD
Feb 13 07:52:17.271044 kernel: audit: type=1334 audit(1707810737.250:146): prog-id=8 op=UNLOAD
Feb 13 07:52:17.250000 audit: BPF prog-id=7 op=UNLOAD
Feb 13 07:52:17.250000 audit: BPF prog-id=8 op=UNLOAD
Feb 13 07:52:17.320209 systemd-udevd[1259]: Using default interface naming scheme 'v252'.
Feb 13 07:52:17.340000 systemd[1]: Started systemd-udevd.service.
Feb 13 07:52:17.347000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:17.349873 systemd[1]: Condition check resulted in dev-ttyS1.device being skipped.
Feb 13 07:52:17.395689 kernel: audit: type=1130 audit(1707810737.347:147): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:17.395780 kernel: BTRFS info: devid 1 device path /dev/disk/by-label/OEM changed to /dev/sdb6 scanned by (udev-worker) (1326)
Feb 13 07:52:17.417000 audit: BPF prog-id=23 op=LOAD
Feb 13 07:52:17.419113 systemd[1]: Starting systemd-networkd.service...
Feb 13 07:52:17.457450 kernel: input: Sleep Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0E:00/input/input2
Feb 13 07:52:17.457479 kernel: ACPI: button: Sleep Button [SLPB]
Feb 13 07:52:17.457492 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3
Feb 13 07:52:17.461948 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device.
Feb 13 07:52:17.478638 kernel: mousedev: PS/2 mouse device common for all mice Feb 13 07:52:17.498635 kernel: ACPI: button: Power Button [PWRF] Feb 13 07:52:17.514000 audit: BPF prog-id=24 op=LOAD Feb 13 07:52:17.515000 audit: BPF prog-id=25 op=LOAD Feb 13 07:52:17.515000 audit: BPF prog-id=26 op=LOAD Feb 13 07:52:17.375000 audit[1283]: AVC avc: denied { confidentiality } for pid=1283 comm="(udev-worker)" lockdown_reason="use of tracefs" scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=lockdown permissive=1 Feb 13 07:52:17.518725 systemd[1]: Starting systemd-userdbd.service... Feb 13 07:52:17.535641 kernel: IPMI message handler: version 39.2 Feb 13 07:52:17.554647 kernel: ipmi device interface Feb 13 07:52:17.375000 audit[1283]: SYSCALL arch=c000003e syscall=175 success=yes exit=0 a0=7f376757a010 a1=4d8bc a2=7f3769213bc5 a3=5 items=42 ppid=1259 pid=1283 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="(udev-worker)" exe="/usr/bin/udevadm" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:52:17.375000 audit: CWD cwd="/" Feb 13 07:52:17.375000 audit: PATH item=0 name=(null) inode=45 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 07:52:17.375000 audit: PATH item=1 name=(null) inode=13530 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 07:52:17.375000 audit: PATH item=2 name=(null) inode=13530 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 07:52:17.375000 audit: PATH item=3 name=(null) inode=13531 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 07:52:17.375000 audit: PATH item=4 name=(null) inode=13530 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 07:52:17.375000 audit: PATH item=5 name=(null) inode=13532 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 07:52:17.375000 audit: PATH item=6 name=(null) inode=13530 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 07:52:17.375000 audit: PATH item=7 name=(null) inode=13533 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 07:52:17.375000 audit: PATH item=8 name=(null) inode=13533 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 07:52:17.375000 audit: PATH item=9 name=(null) inode=13534 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 07:52:17.375000 audit: PATH item=10 name=(null) inode=13533 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 07:52:17.375000 audit: PATH item=11 
name=(null) inode=13535 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 07:52:17.375000 audit: PATH item=12 name=(null) inode=13533 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 07:52:17.375000 audit: PATH item=13 name=(null) inode=13536 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 07:52:17.375000 audit: PATH item=14 name=(null) inode=13533 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 07:52:17.375000 audit: PATH item=15 name=(null) inode=13537 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 07:52:17.375000 audit: PATH item=16 name=(null) inode=13533 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 07:52:17.375000 audit: PATH item=17 name=(null) inode=13538 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 07:52:17.375000 audit: PATH item=18 name=(null) inode=13530 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 07:52:17.375000 audit: PATH item=19 name=(null) inode=13539 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 07:52:17.375000 audit: PATH item=20 name=(null) inode=13539 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 07:52:17.375000 audit: PATH item=21 name=(null) inode=13540 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 07:52:17.375000 audit: PATH item=22 name=(null) inode=13539 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 07:52:17.375000 audit: PATH item=23 name=(null) inode=13541 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 07:52:17.375000 audit: PATH item=24 name=(null) inode=13539 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 07:52:17.375000 audit: PATH item=25 name=(null) inode=13542 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 07:52:17.375000 audit: PATH item=26 name=(null) inode=13539 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 07:52:17.375000 audit: PATH item=27 name=(null) inode=13543 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 
obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 07:52:17.375000 audit: PATH item=28 name=(null) inode=13539 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 07:52:17.375000 audit: PATH item=29 name=(null) inode=13544 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 07:52:17.375000 audit: PATH item=30 name=(null) inode=13530 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 07:52:17.375000 audit: PATH item=31 name=(null) inode=13545 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 07:52:17.375000 audit: PATH item=32 name=(null) inode=13545 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 07:52:17.375000 audit: PATH item=33 name=(null) inode=13546 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 07:52:17.375000 audit: PATH item=34 name=(null) inode=13545 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 07:52:17.375000 audit: PATH item=35 name=(null) inode=13547 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 07:52:17.375000 audit: PATH item=36 name=(null) inode=13545 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 07:52:17.375000 audit: PATH item=37 name=(null) inode=13548 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 07:52:17.375000 audit: PATH item=38 name=(null) inode=13545 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 07:52:17.375000 audit: PATH item=39 name=(null) inode=13549 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 07:52:17.375000 audit: PATH item=40 name=(null) inode=13545 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 07:52:17.375000 audit: PATH item=41 name=(null) inode=13550 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 07:52:17.375000 audit: PROCTITLE proctitle="(udev-worker)" Feb 13 07:52:17.609828 kernel: ipmi_si: IPMI System Interface driver Feb 13 07:52:17.609950 kernel: ipmi_si dmi-ipmi-si.0: ipmi_platform: probing via SMBIOS Feb 13 07:52:17.610061 kernel: ipmi_platform: ipmi_si: SMBIOS: io 0xca2 regsize 1 spacing 1 irq 0 Feb 13 07:52:17.630589 kernel: ipmi_si: Adding SMBIOS-specified kcs state machine Feb 
13 07:52:17.630621 kernel: ipmi_si IPI0001:00: ipmi_platform: probing via ACPI Feb 13 07:52:17.690022 kernel: ipmi_si IPI0001:00: ipmi_platform: [io 0x0ca2] regsize 1 spacing 1 irq 0 Feb 13 07:52:17.714589 systemd[1]: Started systemd-userdbd.service. Feb 13 07:52:17.729487 kernel: i801_smbus 0000:00:1f.4: SPD Write Disable is set Feb 13 07:52:17.729734 kernel: i801_smbus 0000:00:1f.4: SMBus using PCI interrupt Feb 13 07:52:17.743000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-userdbd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:52:17.752640 kernel: i2c i2c-0: 1/4 memory slots populated (from DMI) Feb 13 07:52:17.762653 kernel: mei_me 0000:00:16.0: Device doesn't have valid ME Interface Feb 13 07:52:17.762798 kernel: mei_me 0000:00:16.4: Device doesn't have valid ME Interface Feb 13 07:52:17.762908 kernel: ipmi_si dmi-ipmi-si.0: Removing SMBIOS-specified kcs state machine in favor of ACPI Feb 13 07:52:17.762999 kernel: ipmi_si: Adding ACPI-specified kcs state machine Feb 13 07:52:17.763017 kernel: ipmi_si: Trying ACPI-specified kcs state machine at i/o address 0xca2, slave address 0x20, irq 0 Feb 13 07:52:17.841638 kernel: ipmi_si IPI0001:00: The BMC does not support clearing the recv irq bit, compensating, but the BMC needs to be fixed. Feb 13 07:52:17.922643 kernel: iTCO_vendor_support: vendor-support=0 Feb 13 07:52:17.947339 systemd-networkd[1338]: bond0: netdev ready Feb 13 07:52:17.949750 systemd-networkd[1338]: lo: Link UP Feb 13 07:52:17.949753 systemd-networkd[1338]: lo: Gained carrier Feb 13 07:52:17.950302 systemd-networkd[1338]: Enumeration completed Feb 13 07:52:17.950620 systemd-networkd[1338]: bond0: Configuring with /etc/systemd/network/05-bond0.network. Feb 13 07:52:17.951283 systemd-networkd[1338]: enp1s0f1np1: Configuring with /etc/systemd/network/10-b8:59:9f:de:84:bd.network. Feb 13 07:52:17.953027 systemd[1]: Started systemd-networkd.service. Feb 13 07:52:17.977608 kernel: iTCO_wdt iTCO_wdt: Found a Intel PCH TCO device (Version=6, TCOBASE=0x0400) Feb 13 07:52:17.977721 kernel: ipmi_si IPI0001:00: IPMI message handler: Found new BMC (man_id: 0x002a7c, prod_id: 0x1b0f, dev_id: 0x20) Feb 13 07:52:17.977793 kernel: iTCO_wdt iTCO_wdt: initialized. heartbeat=30 sec (nowayout=0) Feb 13 07:52:18.011000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:52:18.041893 kernel: intel_rapl_common: Found RAPL domain package Feb 13 07:52:18.041920 kernel: intel_rapl_common: Found RAPL domain core Feb 13 07:52:18.041931 kernel: intel_rapl_common: Found RAPL domain dram Feb 13 07:52:18.060322 kernel: ipmi_si IPI0001:00: IPMI kcs interface initialized Feb 13 07:52:18.115665 kernel: ipmi_ssif: IPMI SSIF Interface driver Feb 13 07:52:18.115689 kernel: mlx5_core 0000:01:00.1 enp1s0f1np1: Link up Feb 13 07:52:18.156524 systemd-networkd[1338]: enp1s0f0np0: Configuring with /etc/systemd/network/10-b8:59:9f:de:84:bc.network. 
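The long AVC/SYSCALL/CWD/PATH/PROCTITLE run above is a single audit event emitted in parts: all parts share one event (the repeated 07:52:17.375 timestamp), the AVC part records a lockdown denial ("use of tracefs") with permissive=1, meaning it was logged but not enforced, and the 42 PATH items enumerate the tracefs nodes the udev-worker touched. With the audit userspace tools installed, such raw records can be stitched together and decoded in one step (a sketch; ausearch may not be shipped on a minimal Flatcar image):

    ausearch -i -m AVC       # interpret AVC records into readable form
    ausearch -i -a <serial>  # reassemble all parts of one event by its serial number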
Feb 13 07:52:18.156635 kernel: bond0: (slave enp1s0f1np1): Enslaving as a backup interface with an up link Feb 13 07:52:18.156655 kernel: bond0: Warning: No 802.3ad response from the link partner for any adapters in the bond Feb 13 07:52:18.304669 kernel: bond0: Warning: No 802.3ad response from the link partner for any adapters in the bond Feb 13 07:52:18.569752 kernel: mlx5_core 0000:01:00.0 enp1s0f0np0: Link up Feb 13 07:52:18.594713 kernel: bond0: (slave enp1s0f0np0): Enslaving as a backup interface with an up link Feb 13 07:52:18.594783 kernel: IPv6: ADDRCONF(NETDEV_CHANGE): bond0: link becomes ready Feb 13 07:52:18.614121 systemd-networkd[1338]: bond0: Link UP Feb 13 07:52:18.614317 systemd-networkd[1338]: enp1s0f1np1: Link UP Feb 13 07:52:18.614433 systemd-networkd[1338]: enp1s0f1np1: Gained carrier Feb 13 07:52:18.615391 systemd-networkd[1338]: enp1s0f1np1: Reconfiguring with /etc/systemd/network/10-b8:59:9f:de:84:bc.network. Feb 13 07:52:18.650797 kernel: bond0: (slave enp1s0f1np1): link status definitely up, 25000 Mbps full duplex Feb 13 07:52:18.650819 kernel: bond0: active interface up! Feb 13 07:52:18.672685 kernel: bond0: (slave enp1s0f0np0): link status definitely up, 25000 Mbps full duplex Feb 13 07:52:18.689925 systemd[1]: Finished systemd-udev-settle.service. Feb 13 07:52:18.698000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-settle comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:52:18.699457 systemd[1]: Starting lvm2-activation-early.service... Feb 13 07:52:18.715443 lvm[1363]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Feb 13 07:52:18.747054 systemd[1]: Finished lvm2-activation-early.service. Feb 13 07:52:18.754000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=lvm2-activation-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:52:18.754713 systemd[1]: Reached target cryptsetup.target. Feb 13 07:52:18.764327 systemd[1]: Starting lvm2-activation.service... Feb 13 07:52:18.766434 lvm[1364]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Feb 13 07:52:18.799639 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 07:52:18.821674 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 07:52:18.843689 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 07:52:18.865679 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 07:52:18.866106 systemd[1]: Finished lvm2-activation.service. Feb 13 07:52:18.883000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=lvm2-activation comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:52:18.883782 systemd[1]: Reached target local-fs-pre.target. Feb 13 07:52:18.888678 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 07:52:18.904741 systemd[1]: var-lib-machines.mount was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Feb 13 07:52:18.904761 systemd[1]: Reached target local-fs.target. 
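The enslaving messages above show systemd-networkd assembling the bond: each 25G port is matched by a MAC-named .network file and pointed at bond0, and the "No 802.3ad response" warnings suggest LACP mode and are expected until the switch side answers. The unit contents are not printed in the log; a minimal sketch of what such files typically look like (file names taken from the log, option values assumed):

    # /etc/systemd/network/10-b8:59:9f:de:84:bc.network (sketch)
    [Match]
    PermanentMACAddress=b8:59:9f:de:84:bc
    [Network]
    Bond=bond0

    # /etc/systemd/network/05-bond0.network (sketch)
    [Match]
    Name=bond0
    [Network]
    DHCP=yes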
Feb 13 07:52:18.909636 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 07:52:18.925749 systemd[1]: Reached target machines.target. Feb 13 07:52:18.930650 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 07:52:18.949430 systemd[1]: Starting ldconfig.service... Feb 13 07:52:18.951640 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 07:52:18.967802 systemd[1]: systemd-binfmt.service was skipped because no trigger condition checks were met. Feb 13 07:52:18.967832 systemd[1]: systemd-boot-system-token.service was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/LoaderFeatures-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f). Feb 13 07:52:18.968454 systemd[1]: Starting systemd-boot-update.service... Feb 13 07:52:18.972644 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 07:52:18.988177 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service... Feb 13 07:52:18.993692 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 07:52:19.012217 systemd[1]: Starting systemd-machine-id-commit.service... Feb 13 07:52:19.013417 systemd[1]: systemd-sysext.service was skipped because no trigger condition checks were met. Feb 13 07:52:19.013446 systemd[1]: ensure-sysext.service was skipped because no trigger condition checks were met. Feb 13 07:52:19.013637 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 07:52:19.013978 systemd[1]: Starting systemd-tmpfiles-setup.service... Feb 13 07:52:19.014218 systemd[1]: boot.automount: Got automount request for /boot, triggered by 1367 (bootctl) Feb 13 07:52:19.014867 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-EFI\x2dSYSTEM.service... Feb 13 07:52:19.033637 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 07:52:19.052034 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service. Feb 13 07:52:19.052000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck@dev-disk-by\x2dlabel-OEM comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:52:19.053671 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 07:52:19.072662 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 07:52:19.091636 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 07:52:19.109666 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 07:52:19.110182 systemd-networkd[1338]: enp1s0f0np0: Link UP Feb 13 07:52:19.110339 systemd-networkd[1338]: bond0: Gained carrier Feb 13 07:52:19.110425 systemd-networkd[1338]: enp1s0f0np0: Gained carrier Feb 13 07:52:19.141856 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 07:52:19.141881 kernel: bond0: (slave enp1s0f1np1): invalid new link 1 on slave Feb 13 07:52:19.142936 systemd-networkd[1338]: enp1s0f1np1: Link DOWN Feb 13 07:52:19.142939 systemd-networkd[1338]: enp1s0f1np1: Lost carrier Feb 13 07:52:19.197968 systemd-tmpfiles[1371]: /usr/lib/tmpfiles.d/legacy.conf:13: Duplicate line for path "/run/lock", ignoring. 
Feb 13 07:52:19.310668 kernel: mlx5_core 0000:01:00.1 enp1s0f1np1: Link up Feb 13 07:52:19.327670 kernel: bond0: (slave enp1s0f1np1): speed changed to 0 on port 1 Feb 13 07:52:19.329377 systemd-networkd[1338]: enp1s0f1np1: Link UP Feb 13 07:52:19.329528 systemd-networkd[1338]: enp1s0f1np1: Gained carrier Feb 13 07:52:19.362665 kernel: bond0: (slave enp1s0f1np1): link status up again after 200 ms Feb 13 07:52:19.379692 kernel: bond0: (slave enp1s0f1np1): link status definitely up, 25000 Mbps full duplex Feb 13 07:52:19.519587 systemd-tmpfiles[1371]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Feb 13 07:52:19.534532 systemd-tmpfiles[1371]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Feb 13 07:52:19.537122 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Feb 13 07:52:19.537428 systemd[1]: Finished systemd-machine-id-commit.service. Feb 13 07:52:19.536000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-machine-id-commit comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:52:19.570421 systemd-fsck[1375]: fsck.fat 4.2 (2021-01-31) Feb 13 07:52:19.570421 systemd-fsck[1375]: /dev/sdb1: 789 files, 115339/258078 clusters Feb 13 07:52:19.571151 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-EFI\x2dSYSTEM.service. Feb 13 07:52:19.580000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck@dev-disk-by\x2dlabel-EFI\x2dSYSTEM comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:52:19.582688 systemd[1]: Mounting boot.mount... Feb 13 07:52:19.593841 systemd[1]: Mounted boot.mount. Feb 13 07:52:19.611960 systemd[1]: Finished systemd-boot-update.service. Feb 13 07:52:19.619000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-boot-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:52:19.641525 systemd[1]: Finished systemd-tmpfiles-setup.service. Feb 13 07:52:19.649000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:52:19.650469 systemd[1]: Starting audit-rules.service... Feb 13 07:52:19.657245 systemd[1]: Starting clean-ca-certificates.service... Feb 13 07:52:19.666317 systemd[1]: Starting systemd-journal-catalog-update.service... Feb 13 07:52:19.670000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=add_rule key=(null) list=5 res=1 Feb 13 07:52:19.670000 audit[1396]: SYSCALL arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7ffc5b507bb0 a2=420 a3=0 items=0 ppid=1379 pid=1396 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/sbin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:52:19.670000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Feb 13 07:52:19.671657 augenrules[1396]: No rules Feb 13 07:52:19.675603 systemd[1]: Starting systemd-resolved.service... Feb 13 07:52:19.684585 systemd[1]: Starting systemd-timesyncd.service... 
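Raw audit records hex-encode any field that may contain spaces or NUL bytes, which is why the auditctl invocation behind the CONFIG_CHANGE record above appears as a hex PROCTITLE. Decoding it with standard tools recovers the rule load (a sketch):

    echo 2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 \
      | xxd -r -p | tr '\000' ' '; echo
    # -> /sbin/auditctl -R /etc/audit/audit.rules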
Feb 13 07:52:19.693189 systemd[1]: Starting systemd-update-utmp.service... Feb 13 07:52:19.699987 systemd[1]: Finished audit-rules.service. Feb 13 07:52:19.707796 systemd[1]: Finished clean-ca-certificates.service. Feb 13 07:52:19.716776 systemd[1]: Finished systemd-journal-catalog-update.service. Feb 13 07:52:19.722378 ldconfig[1365]: /sbin/ldconfig: /lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Feb 13 07:52:19.726786 systemd[1]: Finished ldconfig.service. Feb 13 07:52:19.735379 systemd[1]: Finished systemd-update-utmp.service. Feb 13 07:52:19.745298 systemd[1]: Starting systemd-update-done.service... Feb 13 07:52:19.751679 systemd[1]: update-ca-certificates.service was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Feb 13 07:52:19.751865 systemd[1]: Finished systemd-update-done.service. Feb 13 07:52:19.760822 systemd[1]: Started systemd-timesyncd.service. Feb 13 07:52:19.761991 systemd-resolved[1401]: Positive Trust Anchors: Feb 13 07:52:19.761999 systemd-resolved[1401]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Feb 13 07:52:19.762017 systemd-resolved[1401]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa corp home internal intranet lan local private test Feb 13 07:52:19.765660 systemd-resolved[1401]: Using system hostname 'ci-3510.3.2-a-bf0bde3476'. Feb 13 07:52:19.769751 systemd[1]: Started systemd-resolved.service. Feb 13 07:52:19.779067 systemd[1]: Reached target network.target. Feb 13 07:52:19.787718 systemd[1]: Reached target nss-lookup.target. Feb 13 07:52:19.795722 systemd[1]: Reached target sysinit.target. Feb 13 07:52:19.803758 systemd[1]: Started motdgen.path. Feb 13 07:52:19.810733 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path. Feb 13 07:52:19.820706 systemd[1]: Started systemd-tmpfiles-clean.timer. Feb 13 07:52:19.828700 systemd[1]: update-engine-stub.timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Feb 13 07:52:19.828716 systemd[1]: Reached target paths.target. Feb 13 07:52:19.835703 systemd[1]: Reached target time-set.target. Feb 13 07:52:19.844149 systemd[1]: Started logrotate.timer. Feb 13 07:52:19.850771 systemd[1]: Started mdadm.timer. Feb 13 07:52:19.857707 systemd[1]: Reached target timers.target. Feb 13 07:52:19.864830 systemd[1]: Listening on dbus.socket. Feb 13 07:52:19.872241 systemd[1]: Starting docker.socket... Feb 13 07:52:19.880078 systemd[1]: Listening on sshd.socket. Feb 13 07:52:19.886775 systemd[1]: systemd-pcrphase-sysinit.service was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f). Feb 13 07:52:19.886991 systemd[1]: Listening on docker.socket. Feb 13 07:52:19.893765 systemd[1]: Reached target sockets.target. Feb 13 07:52:19.901719 systemd[1]: Reached target basic.target. Feb 13 07:52:19.908731 systemd[1]: addon-config@usr-share-oem.service was skipped because no trigger condition checks were met. 
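The trust-anchor lines show systemd-resolved loading its built-in DNSSEC positive trust anchor (the root zone's DS record, key tag 20326) plus the standard negative anchors that exempt private and special-use domains (home.arpa, the RFC 1918 reverse zones, .local, and so on) from DNSSEC validation. The resulting resolver state can be inspected on a running host with (hypothetical invocations, not part of this log):

    resolvectl status       # per-link DNS servers and DNSSEC setting
    resolvectl statistics   # cache and DNSSEC verdict counters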
Feb 13 07:52:19.908747 systemd[1]: addon-run@usr-share-oem.service was skipped because no trigger condition checks were met. Feb 13 07:52:19.909221 systemd[1]: Starting containerd.service... Feb 13 07:52:19.916164 systemd[1]: Starting coreos-metadata-sshkeys@core.service... Feb 13 07:52:19.925187 systemd[1]: Starting coreos-metadata.service... Feb 13 07:52:19.932198 systemd[1]: Starting dbus.service... Feb 13 07:52:19.938394 systemd[1]: Starting enable-oem-cloudinit.service... Feb 13 07:52:19.943798 jq[1416]: false Feb 13 07:52:19.945271 systemd[1]: Starting extend-filesystems.service... Feb 13 07:52:19.946309 coreos-metadata[1409]: Feb 13 07:52:19.946 INFO Fetching https://metadata.packet.net/metadata: Attempt #1 Feb 13 07:52:19.950869 dbus-daemon[1415]: [system] SELinux support is enabled Feb 13 07:52:19.951723 systemd[1]: flatcar-setup-environment.service was skipped because of an unmet condition check (ConditionPathExists=/usr/share/oem/bin/flatcar-setup-environment). Feb 13 07:52:19.952333 systemd[1]: Starting motdgen.service... Feb 13 07:52:19.953550 extend-filesystems[1417]: Found sda Feb 13 07:52:19.965745 extend-filesystems[1417]: Found sdb Feb 13 07:52:19.965745 extend-filesystems[1417]: Found sdb1 Feb 13 07:52:19.965745 extend-filesystems[1417]: Found sdb2 Feb 13 07:52:19.965745 extend-filesystems[1417]: Found sdb3 Feb 13 07:52:19.965745 extend-filesystems[1417]: Found usr Feb 13 07:52:19.965745 extend-filesystems[1417]: Found sdb4 Feb 13 07:52:19.965745 extend-filesystems[1417]: Found sdb6 Feb 13 07:52:19.965745 extend-filesystems[1417]: Found sdb7 Feb 13 07:52:19.965745 extend-filesystems[1417]: Found sdb9 Feb 13 07:52:19.965745 extend-filesystems[1417]: Checking size of /dev/sdb9 Feb 13 07:52:19.965745 extend-filesystems[1417]: Resized partition /dev/sdb9 Feb 13 07:52:20.088734 kernel: EXT4-fs (sdb9): resizing filesystem from 553472 to 116605649 blocks Feb 13 07:52:20.088775 coreos-metadata[1412]: Feb 13 07:52:19.955 INFO Fetching https://metadata.packet.net/metadata: Attempt #1 Feb 13 07:52:19.959567 systemd[1]: Starting prepare-cni-plugins.service... Feb 13 07:52:20.088920 extend-filesystems[1433]: resize2fs 1.46.5 (30-Dec-2021) Feb 13 07:52:19.980712 systemd[1]: Starting prepare-critools.service... Feb 13 07:52:19.995341 systemd[1]: Starting prepare-helm.service... Feb 13 07:52:20.014285 systemd[1]: Starting ssh-key-proc-cmdline.service... Feb 13 07:52:20.033198 systemd[1]: Starting sshd-keygen.service... Feb 13 07:52:20.052020 systemd[1]: Starting systemd-logind.service... Feb 13 07:52:20.057722 systemd[1]: systemd-pcrphase.service was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f). Feb 13 07:52:20.058219 systemd[1]: Starting tcsd.service... Feb 13 07:52:20.077165 systemd-logind[1446]: Watching system buttons on /dev/input/event3 (Power Button) Feb 13 07:52:20.105485 jq[1449]: true Feb 13 07:52:20.077174 systemd-logind[1446]: Watching system buttons on /dev/input/event2 (Sleep Button) Feb 13 07:52:20.077184 systemd-logind[1446]: Watching system buttons on /dev/input/event0 (HID 0557:2419) Feb 13 07:52:20.077311 systemd-logind[1446]: New seat seat0. Feb 13 07:52:20.081077 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Feb 13 07:52:20.081437 systemd[1]: Starting update-engine.service... Feb 13 07:52:20.097258 systemd[1]: Starting update-ssh-keys-after-ignition.service... 
Feb 13 07:52:20.113038 systemd[1]: Started dbus.service. Feb 13 07:52:20.122437 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Feb 13 07:52:20.122556 systemd[1]: Condition check resulted in enable-oem-cloudinit.service being skipped. Feb 13 07:52:20.122749 systemd[1]: motdgen.service: Deactivated successfully. Feb 13 07:52:20.122845 systemd[1]: Finished motdgen.service. Feb 13 07:52:20.125311 update_engine[1448]: I0213 07:52:20.124902 1448 main.cc:92] Flatcar Update Engine starting Feb 13 07:52:20.128162 update_engine[1448]: I0213 07:52:20.128113 1448 update_check_scheduler.cc:74] Next update check in 8m53s Feb 13 07:52:20.130849 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Feb 13 07:52:20.130953 systemd[1]: Finished ssh-key-proc-cmdline.service. Feb 13 07:52:20.134913 tar[1451]: ./ Feb 13 07:52:20.134913 tar[1451]: ./loopback Feb 13 07:52:20.142341 jq[1457]: true Feb 13 07:52:20.142848 dbus-daemon[1415]: [system] Successfully activated service 'org.freedesktop.systemd1' Feb 13 07:52:20.143895 tar[1452]: crictl Feb 13 07:52:20.145564 tar[1453]: linux-amd64/helm Feb 13 07:52:20.148582 systemd[1]: tcsd.service: Skipped due to 'exec-condition'. Feb 13 07:52:20.148713 systemd[1]: Condition check resulted in tcsd.service being skipped. Feb 13 07:52:20.149941 systemd[1]: Started update-engine.service. Feb 13 07:52:20.153437 tar[1451]: ./bandwidth Feb 13 07:52:20.154062 env[1458]: time="2024-02-13T07:52:20.154008301Z" level=info msg="starting containerd" revision=92b3a9d6f1b3bcc6dc74875cfdea653fe39f09c2 version=1.6.16 Feb 13 07:52:20.162588 env[1458]: time="2024-02-13T07:52:20.162568115Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1 Feb 13 07:52:20.162825 systemd[1]: Started systemd-logind.service. Feb 13 07:52:20.163337 env[1458]: time="2024-02-13T07:52:20.163320996Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1 Feb 13 07:52:20.163970 env[1458]: time="2024-02-13T07:52:20.163953098Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/5.15.148-flatcar\\n\"): skip plugin" type=io.containerd.snapshotter.v1 Feb 13 07:52:20.164006 env[1458]: time="2024-02-13T07:52:20.163969603Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1 Feb 13 07:52:20.165812 env[1458]: time="2024-02-13T07:52:20.165797324Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Feb 13 07:52:20.165865 env[1458]: time="2024-02-13T07:52:20.165811060Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1 Feb 13 07:52:20.165865 env[1458]: time="2024-02-13T07:52:20.165823429Z" level=warning msg="failed to load plugin io.containerd.snapshotter.v1.devmapper" error="devmapper not configured" Feb 13 07:52:20.165865 env[1458]: time="2024-02-13T07:52:20.165833300Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." 
type=io.containerd.snapshotter.v1 Feb 13 07:52:20.165950 env[1458]: time="2024-02-13T07:52:20.165893053Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1 Feb 13 07:52:20.166055 env[1458]: time="2024-02-13T07:52:20.166044376Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1 Feb 13 07:52:20.166138 env[1458]: time="2024-02-13T07:52:20.166125113Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Feb 13 07:52:20.166171 env[1458]: time="2024-02-13T07:52:20.166136996Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1 Feb 13 07:52:20.167945 env[1458]: time="2024-02-13T07:52:20.167931316Z" level=warning msg="could not use snapshotter devmapper in metadata plugin" error="devmapper not configured" Feb 13 07:52:20.167989 env[1458]: time="2024-02-13T07:52:20.167945004Z" level=info msg="metadata content store policy set" policy=shared Feb 13 07:52:20.170292 bash[1481]: Updated "/home/core/.ssh/authorized_keys" Feb 13 07:52:20.170915 systemd[1]: Finished update-ssh-keys-after-ignition.service. Feb 13 07:52:20.171739 systemd-networkd[1338]: bond0: Gained IPv6LL Feb 13 07:52:20.175221 env[1458]: time="2024-02-13T07:52:20.175207651Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1 Feb 13 07:52:20.175263 env[1458]: time="2024-02-13T07:52:20.175226087Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1 Feb 13 07:52:20.175263 env[1458]: time="2024-02-13T07:52:20.175240623Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1 Feb 13 07:52:20.175326 env[1458]: time="2024-02-13T07:52:20.175268796Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1 Feb 13 07:52:20.175326 env[1458]: time="2024-02-13T07:52:20.175285980Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1 Feb 13 07:52:20.175326 env[1458]: time="2024-02-13T07:52:20.175300458Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1 Feb 13 07:52:20.175326 env[1458]: time="2024-02-13T07:52:20.175312359Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1 Feb 13 07:52:20.175429 env[1458]: time="2024-02-13T07:52:20.175326620Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1 Feb 13 07:52:20.175429 env[1458]: time="2024-02-13T07:52:20.175343518Z" level=info msg="loading plugin \"io.containerd.service.v1.leases-service\"..." type=io.containerd.service.v1 Feb 13 07:52:20.175429 env[1458]: time="2024-02-13T07:52:20.175357438Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1 Feb 13 07:52:20.175429 env[1458]: time="2024-02-13T07:52:20.175369200Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1 Feb 13 07:52:20.175429 env[1458]: time="2024-02-13T07:52:20.175381237Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." 
type=io.containerd.runtime.v1 Feb 13 07:52:20.175559 env[1458]: time="2024-02-13T07:52:20.175442755Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2 Feb 13 07:52:20.175559 env[1458]: time="2024-02-13T07:52:20.175510833Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1 Feb 13 07:52:20.175966 env[1458]: time="2024-02-13T07:52:20.175945784Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1 Feb 13 07:52:20.176026 env[1458]: time="2024-02-13T07:52:20.176015340Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1 Feb 13 07:52:20.176060 env[1458]: time="2024-02-13T07:52:20.176031612Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1 Feb 13 07:52:20.176089 env[1458]: time="2024-02-13T07:52:20.176075497Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1 Feb 13 07:52:20.176118 env[1458]: time="2024-02-13T07:52:20.176089477Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1 Feb 13 07:52:20.176118 env[1458]: time="2024-02-13T07:52:20.176101493Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1 Feb 13 07:52:20.176118 env[1458]: time="2024-02-13T07:52:20.176113570Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1 Feb 13 07:52:20.176194 env[1458]: time="2024-02-13T07:52:20.176124724Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1 Feb 13 07:52:20.176194 env[1458]: time="2024-02-13T07:52:20.176136638Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1 Feb 13 07:52:20.176194 env[1458]: time="2024-02-13T07:52:20.176147597Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1 Feb 13 07:52:20.176194 env[1458]: time="2024-02-13T07:52:20.176158887Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1 Feb 13 07:52:20.176194 env[1458]: time="2024-02-13T07:52:20.176174480Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1 Feb 13 07:52:20.176337 env[1458]: time="2024-02-13T07:52:20.176271717Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1 Feb 13 07:52:20.176337 env[1458]: time="2024-02-13T07:52:20.176285849Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1 Feb 13 07:52:20.176337 env[1458]: time="2024-02-13T07:52:20.176298180Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1 Feb 13 07:52:20.176337 env[1458]: time="2024-02-13T07:52:20.176309072Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1 Feb 13 07:52:20.176337 env[1458]: time="2024-02-13T07:52:20.176321942Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="no OpenTelemetry endpoint: skip plugin" type=io.containerd.tracing.processor.v1 Feb 13 07:52:20.176337 env[1458]: time="2024-02-13T07:52:20.176334922Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." 
type=io.containerd.internal.v1 Feb 13 07:52:20.176493 env[1458]: time="2024-02-13T07:52:20.176350019Z" level=error msg="failed to initialize a tracing processor \"otlp\"" error="no OpenTelemetry endpoint: skip plugin" Feb 13 07:52:20.176493 env[1458]: time="2024-02-13T07:52:20.176449974Z" level=info msg="loading plugin \"io.containerd.grpc.v1.cri\"..." type=io.containerd.grpc.v1 Feb 13 07:52:20.176622 env[1458]: time="2024-02-13T07:52:20.176593964Z" level=info msg="Start cri plugin with config {PluginConfig:{ContainerdConfig:{Snapshotter:overlayfs DefaultRuntimeName:runc DefaultRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0} UntrustedWorkloadRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0} Runtimes:map[runc:{Type:io.containerd.runc.v2 Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[SystemdCgroup:true] PrivilegedWithoutHostDevices:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0}] NoPivot:false DisableSnapshotAnnotations:true DiscardUnpackedLayers:false IgnoreRdtNotEnabledErrors:false} CniConfig:{NetworkPluginBinDir:/opt/cni/bin NetworkPluginConfDir:/etc/cni/net.d NetworkPluginMaxConfNum:1 NetworkPluginConfTemplate: IPPreference:} Registry:{ConfigPath: Mirrors:map[] Configs:map[] Auths:map[] Headers:map[]} ImageDecryption:{KeyModel:node} DisableTCPService:true StreamServerAddress:127.0.0.1 StreamServerPort:0 StreamIdleTimeout:4h0m0s EnableSelinux:true SelinuxCategoryRange:1024 SandboxImage:registry.k8s.io/pause:3.6 StatsCollectPeriod:10 SystemdCgroup:false EnableTLSStreaming:false X509KeyPairStreaming:{TLSCertFile: TLSKeyFile:} MaxContainerLogLineSize:16384 DisableCgroup:false DisableApparmor:false RestrictOOMScoreAdj:false MaxConcurrentDownloads:3 DisableProcMount:false UnsetSeccompProfile: TolerateMissingHugetlbController:true DisableHugetlbController:true DeviceOwnershipFromSecurityContext:false IgnoreImageDefinedVolumes:false NetNSMountsUnderStateDir:false EnableUnprivilegedPorts:false EnableUnprivilegedICMP:false} ContainerdRootDir:/var/lib/containerd ContainerdEndpoint:/run/containerd/containerd.sock RootDir:/var/lib/containerd/io.containerd.grpc.v1.cri StateDir:/run/containerd/io.containerd.grpc.v1.cri}" Feb 13 07:52:20.178340 env[1458]: time="2024-02-13T07:52:20.176634516Z" level=info msg="Connect containerd service" Feb 13 07:52:20.178340 env[1458]: time="2024-02-13T07:52:20.176663920Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\"" Feb 13 07:52:20.178340 env[1458]: time="2024-02-13T07:52:20.176968744Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Feb 13 07:52:20.178340 env[1458]: time="2024-02-13T07:52:20.177073604Z" level=info msg="Start subscribing containerd event" Feb 13 07:52:20.178340 env[1458]: time="2024-02-13T07:52:20.177091590Z" level=info msg=serving... 
address=/run/containerd/containerd.sock.ttrpc Feb 13 07:52:20.178340 env[1458]: time="2024-02-13T07:52:20.177109220Z" level=info msg="Start recovering state" Feb 13 07:52:20.178340 env[1458]: time="2024-02-13T07:52:20.177116735Z" level=info msg=serving... address=/run/containerd/containerd.sock Feb 13 07:52:20.178340 env[1458]: time="2024-02-13T07:52:20.177140478Z" level=info msg="containerd successfully booted in 0.023552s" Feb 13 07:52:20.178340 env[1458]: time="2024-02-13T07:52:20.177157440Z" level=info msg="Start event monitor" Feb 13 07:52:20.178340 env[1458]: time="2024-02-13T07:52:20.177170275Z" level=info msg="Start snapshots syncer" Feb 13 07:52:20.178340 env[1458]: time="2024-02-13T07:52:20.177181497Z" level=info msg="Start cni network conf syncer for default" Feb 13 07:52:20.178340 env[1458]: time="2024-02-13T07:52:20.177188966Z" level=info msg="Start streaming server" Feb 13 07:52:20.180803 systemd[1]: Started containerd.service. Feb 13 07:52:20.183452 tar[1451]: ./ptp Feb 13 07:52:20.189480 systemd[1]: Started locksmithd.service. Feb 13 07:52:20.195770 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Feb 13 07:52:20.195861 systemd[1]: Reached target system-config.target. Feb 13 07:52:20.203739 systemd[1]: user-cloudinit-proc-cmdline.service was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Feb 13 07:52:20.203845 systemd[1]: Reached target user-config.target. Feb 13 07:52:20.207391 tar[1451]: ./vlan Feb 13 07:52:20.229605 tar[1451]: ./host-device Feb 13 07:52:20.249728 locksmithd[1496]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Feb 13 07:52:20.251053 tar[1451]: ./tuning Feb 13 07:52:20.269987 tar[1451]: ./vrf Feb 13 07:52:20.289812 tar[1451]: ./sbr Feb 13 07:52:20.309183 tar[1451]: ./tap Feb 13 07:52:20.331334 tar[1451]: ./dhcp Feb 13 07:52:20.387624 tar[1451]: ./static Feb 13 07:52:20.403661 tar[1451]: ./firewall Feb 13 07:52:20.404763 tar[1453]: linux-amd64/LICENSE Feb 13 07:52:20.404823 tar[1453]: linux-amd64/README.md Feb 13 07:52:20.407406 systemd[1]: Finished prepare-helm.service. Feb 13 07:52:20.419164 systemd[1]: Finished prepare-critools.service. Feb 13 07:52:20.431988 tar[1451]: ./macvlan Feb 13 07:52:20.456434 tar[1451]: ./dummy Feb 13 07:52:20.468659 kernel: EXT4-fs (sdb9): resized filesystem to 116605649 Feb 13 07:52:20.497806 extend-filesystems[1433]: Filesystem at /dev/sdb9 is mounted on /; on-line resizing required Feb 13 07:52:20.497806 extend-filesystems[1433]: old_desc_blocks = 1, new_desc_blocks = 56 Feb 13 07:52:20.497806 extend-filesystems[1433]: The filesystem on /dev/sdb9 is now 116605649 (4k) blocks long. Feb 13 07:52:20.536681 extend-filesystems[1417]: Resized filesystem in /dev/sdb9 Feb 13 07:52:20.544672 tar[1451]: ./bridge Feb 13 07:52:20.544672 tar[1451]: ./ipvlan Feb 13 07:52:20.498337 systemd[1]: extend-filesystems.service: Deactivated successfully. Feb 13 07:52:20.498440 systemd[1]: Finished extend-filesystems.service. Feb 13 07:52:20.555201 tar[1451]: ./portmap Feb 13 07:52:20.575894 tar[1451]: ./host-local Feb 13 07:52:20.599724 systemd[1]: Finished prepare-cni-plugins.service. Feb 13 07:52:20.911793 sshd_keygen[1445]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Feb 13 07:52:20.923251 systemd[1]: Finished sshd-keygen.service. Feb 13 07:52:20.930516 systemd[1]: Starting issuegen.service... 
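The extend-filesystems sequence above is an online ext4 grow: resize2fs 1.46.5 resizes the mounted root filesystem on /dev/sdb9 from 553472 to 116605649 4k blocks (roughly 2.1 GiB to 445 GiB), growing the descriptor blocks from 1 to 56 along the way. The manual equivalent would be approximately (a sketch; device name taken from the log):

    resize2fs /dev/sdb9   # grow the mounted ext4 filesystem to fill its partition
    df -h /               # confirm the new capacity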
Feb 13 07:52:20.937942 systemd[1]: issuegen.service: Deactivated successfully. Feb 13 07:52:20.938011 systemd[1]: Finished issuegen.service. Feb 13 07:52:20.945487 systemd[1]: Starting systemd-user-sessions.service... Feb 13 07:52:20.954912 systemd[1]: Finished systemd-user-sessions.service. Feb 13 07:52:20.964281 systemd[1]: Started getty@tty1.service. Feb 13 07:52:20.972236 systemd[1]: Started serial-getty@ttyS1.service. Feb 13 07:52:20.980850 systemd[1]: Reached target getty.target. Feb 13 07:52:21.526704 kernel: mlx5_core 0000:01:00.0: lag map port 1:1 port 2:2 shared_fdb:0 Feb 13 07:52:25.932765 coreos-metadata[1412]: Feb 13 07:52:25.932 INFO Failed to fetch: error sending request for url (https://metadata.packet.net/metadata): error trying to connect: dns error: failed to lookup address information: Name or service not known Feb 13 07:52:25.933550 coreos-metadata[1409]: Feb 13 07:52:25.932 INFO Failed to fetch: error sending request for url (https://metadata.packet.net/metadata): error trying to connect: dns error: failed to lookup address information: Name or service not known Feb 13 07:52:25.992642 login[1521]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Feb 13 07:52:26.001085 login[1520]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Feb 13 07:52:26.001828 systemd-logind[1446]: New session 1 of user core. Feb 13 07:52:26.002399 systemd[1]: Created slice user-500.slice. Feb 13 07:52:26.002951 systemd[1]: Starting user-runtime-dir@500.service... Feb 13 07:52:26.004158 systemd-logind[1446]: New session 2 of user core. Feb 13 07:52:26.008088 systemd[1]: Finished user-runtime-dir@500.service. Feb 13 07:52:26.008753 systemd[1]: Starting user@500.service... Feb 13 07:52:26.010601 (systemd)[1525]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Feb 13 07:52:26.077940 systemd[1525]: Queued start job for default target default.target. Feb 13 07:52:26.078161 systemd[1525]: Reached target paths.target. Feb 13 07:52:26.078172 systemd[1525]: Reached target sockets.target. Feb 13 07:52:26.078180 systemd[1525]: Reached target timers.target. Feb 13 07:52:26.078187 systemd[1525]: Reached target basic.target. Feb 13 07:52:26.078206 systemd[1525]: Reached target default.target. Feb 13 07:52:26.078219 systemd[1525]: Startup finished in 64ms. Feb 13 07:52:26.078266 systemd[1]: Started user@500.service. Feb 13 07:52:26.078814 systemd[1]: Started session-1.scope. Feb 13 07:52:26.079151 systemd[1]: Started session-2.scope. Feb 13 07:52:26.569999 kernel: mlx5_core 0000:01:00.0: modify lag map port 1:2 port 2:2 Feb 13 07:52:26.570156 kernel: mlx5_core 0000:01:00.0: modify lag map port 1:1 port 2:2 Feb 13 07:52:26.933107 coreos-metadata[1412]: Feb 13 07:52:26.932 INFO Fetching https://metadata.packet.net/metadata: Attempt #2 Feb 13 07:52:26.933852 coreos-metadata[1409]: Feb 13 07:52:26.932 INFO Fetching https://metadata.packet.net/metadata: Attempt #2 Feb 13 07:52:26.965125 systemd[1]: Created slice system-sshd.slice. Feb 13 07:52:26.965854 systemd[1]: Started sshd@0-145.40.90.207:22-139.178.68.195:51316.service. Feb 13 07:52:26.979819 coreos-metadata[1412]: Feb 13 07:52:26.979 INFO Fetch successful Feb 13 07:52:26.980210 coreos-metadata[1409]: Feb 13 07:52:26.980 INFO Fetch successful Feb 13 07:52:27.003298 systemd[1]: Finished coreos-metadata.service. Feb 13 07:52:27.004057 systemd[1]: Started packet-phone-home.service. 
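Both coreos-metadata fetchers fail on their first attempt because they start before name resolution is usable, then succeed on the retry once the network and resolver are up; one instance feeds coreos-metadata.service and the other writes the fetched SSH keys for the core user. The same Equinix Metal endpoint can be queried by hand from the host (URL from the log; access may be restricted to the instance itself):

    curl -s https://metadata.packet.net/metadata | head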
Feb 13 07:52:27.004790 unknown[1409]: wrote ssh authorized keys file for user: core Feb 13 07:52:27.009515 sshd[1546]: Accepted publickey for core from 139.178.68.195 port 51316 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 07:52:27.010284 sshd[1546]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 07:52:27.010406 curl[1550]: % Total % Received % Xferd Average Speed Time Time Time Current Feb 13 07:52:27.010528 curl[1550]: Dload Upload Total Spent Left Speed Feb 13 07:52:27.012680 systemd-logind[1446]: New session 3 of user core. Feb 13 07:52:27.013150 systemd[1]: Started session-3.scope. Feb 13 07:52:27.016509 update-ssh-keys[1551]: Updated "/home/core/.ssh/authorized_keys" Feb 13 07:52:27.016752 systemd[1]: Finished coreos-metadata-sshkeys@core.service. Feb 13 07:52:27.016905 systemd[1]: Reached target multi-user.target. Feb 13 07:52:27.017523 systemd[1]: Starting systemd-update-utmp-runlevel.service... Feb 13 07:52:27.021461 systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully. Feb 13 07:52:27.021533 systemd[1]: Finished systemd-update-utmp-runlevel.service. Feb 13 07:52:27.022930 systemd[1]: Startup finished in 1.899s (kernel) + 19.061s (initrd) + 14.130s (userspace) = 35.092s. Feb 13 07:52:27.068356 systemd[1]: Started sshd@1-145.40.90.207:22-139.178.68.195:51320.service. Feb 13 07:52:27.104737 sshd[1556]: Accepted publickey for core from 139.178.68.195 port 51320 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 07:52:27.105416 sshd[1556]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 07:52:27.107674 systemd-logind[1446]: New session 4 of user core. Feb 13 07:52:27.108135 systemd[1]: Started session-4.scope. Feb 13 07:52:27.137220 systemd-timesyncd[1402]: Contacted time server 68.64.173.196:123 (0.flatcar.pool.ntp.org). Feb 13 07:52:27.137266 systemd-timesyncd[1402]: Initial clock synchronization to Tue 2024-02-13 07:52:26.993435 UTC. Feb 13 07:52:27.159252 sshd[1556]: pam_unix(sshd:session): session closed for user core Feb 13 07:52:27.162331 systemd[1]: sshd@1-145.40.90.207:22-139.178.68.195:51320.service: Deactivated successfully. Feb 13 07:52:27.163126 systemd[1]: session-4.scope: Deactivated successfully. Feb 13 07:52:27.164066 systemd-logind[1446]: Session 4 logged out. Waiting for processes to exit. Feb 13 07:52:27.165538 systemd[1]: Started sshd@2-145.40.90.207:22-139.178.68.195:51336.service. Feb 13 07:52:27.166677 systemd-logind[1446]: Removed session 4. Feb 13 07:52:27.194968 curl[1550]: \u000d 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\u000d 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 Feb 13 07:52:27.196899 systemd[1]: packet-phone-home.service: Deactivated successfully. Feb 13 07:52:27.226311 sshd[1562]: Accepted publickey for core from 139.178.68.195 port 51336 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 07:52:27.228903 sshd[1562]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 07:52:27.237525 systemd-logind[1446]: New session 5 of user core. Feb 13 07:52:27.239877 systemd[1]: Started session-5.scope. Feb 13 07:52:27.307481 sshd[1562]: pam_unix(sshd:session): session closed for user core Feb 13 07:52:27.309099 systemd[1]: sshd@2-145.40.90.207:22-139.178.68.195:51336.service: Deactivated successfully. Feb 13 07:52:27.309413 systemd[1]: session-5.scope: Deactivated successfully. Feb 13 07:52:27.309754 systemd-logind[1446]: Session 5 logged out. Waiting for processes to exit. 
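The "Startup finished" summary above breaks boot time into kernel, initrd, and userspace phases. After boot, the same figures and the units dominating them can be pulled with systemd-analyze (hypothetical invocations, not part of this log):

    systemd-analyze time                              # the same kernel/initrd/userspace breakdown
    systemd-analyze blame | head                      # slowest units first
    systemd-analyze critical-chain multi-user.target  # what gated reaching the default target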
Feb 13 07:52:27.310287 systemd[1]: Started sshd@3-145.40.90.207:22-139.178.68.195:51352.service. Feb 13 07:52:27.310628 systemd-logind[1446]: Removed session 5. Feb 13 07:52:27.344722 sshd[1569]: Accepted publickey for core from 139.178.68.195 port 51352 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 07:52:27.345871 sshd[1569]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 07:52:27.350004 systemd-logind[1446]: New session 6 of user core. Feb 13 07:52:27.350949 systemd[1]: Started session-6.scope. Feb 13 07:52:27.420458 sshd[1569]: pam_unix(sshd:session): session closed for user core Feb 13 07:52:27.427019 systemd[1]: sshd@3-145.40.90.207:22-139.178.68.195:51352.service: Deactivated successfully. Feb 13 07:52:27.428582 systemd[1]: session-6.scope: Deactivated successfully. Feb 13 07:52:27.430288 systemd-logind[1446]: Session 6 logged out. Waiting for processes to exit. Feb 13 07:52:27.432847 systemd[1]: Started sshd@4-145.40.90.207:22-139.178.68.195:51354.service. Feb 13 07:52:27.435241 systemd-logind[1446]: Removed session 6. Feb 13 07:52:27.502134 sshd[1575]: Accepted publickey for core from 139.178.68.195 port 51354 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 07:52:27.505196 sshd[1575]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 07:52:27.514567 systemd-logind[1446]: New session 7 of user core. Feb 13 07:52:27.516727 systemd[1]: Started session-7.scope. Feb 13 07:52:27.613858 sudo[1578]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Feb 13 07:52:27.614458 sudo[1578]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=500) Feb 13 07:52:27.636042 dbus-daemon[1415]: \xd0}y\xbe\x83U: received setenforce notice (enforcing=34235488) Feb 13 07:52:27.641133 sudo[1578]: pam_unix(sudo:session): session closed for user root Feb 13 07:52:27.646269 sshd[1575]: pam_unix(sshd:session): session closed for user core Feb 13 07:52:27.653284 systemd[1]: sshd@4-145.40.90.207:22-139.178.68.195:51354.service: Deactivated successfully. Feb 13 07:52:27.654947 systemd[1]: session-7.scope: Deactivated successfully. Feb 13 07:52:27.656673 systemd-logind[1446]: Session 7 logged out. Waiting for processes to exit. Feb 13 07:52:27.659326 systemd[1]: Started sshd@5-145.40.90.207:22-139.178.68.195:51356.service. Feb 13 07:52:27.661727 systemd-logind[1446]: Removed session 7. Feb 13 07:52:27.729976 sshd[1582]: Accepted publickey for core from 139.178.68.195 port 51356 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 07:52:27.732973 sshd[1582]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 07:52:27.742506 systemd-logind[1446]: New session 8 of user core. Feb 13 07:52:27.744675 systemd[1]: Started session-8.scope. Feb 13 07:52:27.817795 sudo[1586]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Feb 13 07:52:27.817901 sudo[1586]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=500) Feb 13 07:52:27.819647 sudo[1586]: pam_unix(sudo:session): session closed for user root Feb 13 07:52:27.821874 sudo[1585]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/systemctl restart audit-rules Feb 13 07:52:27.821979 sudo[1585]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=500) Feb 13 07:52:27.827294 systemd[1]: Stopping audit-rules.service... 
Feb 13 07:52:27.827000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Feb 13 07:52:27.828152 auditctl[1589]: No rules Feb 13 07:52:27.828311 systemd[1]: audit-rules.service: Deactivated successfully. Feb 13 07:52:27.828397 systemd[1]: Stopped audit-rules.service. Feb 13 07:52:27.829255 systemd[1]: Starting audit-rules.service... Feb 13 07:52:27.833486 kernel: kauditd_printk_skb: 63 callbacks suppressed Feb 13 07:52:27.833516 kernel: audit: type=1305 audit(1707810747.827:164): auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Feb 13 07:52:27.838676 augenrules[1606]: No rules Feb 13 07:52:27.838952 systemd[1]: Finished audit-rules.service. Feb 13 07:52:27.839360 sudo[1585]: pam_unix(sudo:session): session closed for user root Feb 13 07:52:27.840140 sshd[1582]: pam_unix(sshd:session): session closed for user core Feb 13 07:52:27.841880 systemd[1]: sshd@5-145.40.90.207:22-139.178.68.195:51356.service: Deactivated successfully. Feb 13 07:52:27.842229 systemd[1]: session-8.scope: Deactivated successfully. Feb 13 07:52:27.842548 systemd-logind[1446]: Session 8 logged out. Waiting for processes to exit. Feb 13 07:52:27.843135 systemd[1]: Started sshd@6-145.40.90.207:22-139.178.68.195:51368.service. Feb 13 07:52:27.843565 systemd-logind[1446]: Removed session 8. Feb 13 07:52:27.827000 audit[1589]: SYSCALL arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7ffcf88a29f0 a2=420 a3=0 items=0 ppid=1 pid=1589 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/sbin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:52:27.880099 kernel: audit: type=1300 audit(1707810747.827:164): arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7ffcf88a29f0 a2=420 a3=0 items=0 ppid=1 pid=1589 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/sbin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:52:27.880166 kernel: audit: type=1327 audit(1707810747.827:164): proctitle=2F7362696E2F617564697463746C002D44 Feb 13 07:52:27.827000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D44 Feb 13 07:52:27.827000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:52:27.907750 sshd[1612]: Accepted publickey for core from 139.178.68.195 port 51368 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 07:52:27.908941 sshd[1612]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 07:52:27.911129 systemd-logind[1446]: New session 9 of user core. Feb 13 07:52:27.911764 systemd[1]: Started session-9.scope. Feb 13 07:52:27.912065 kernel: audit: type=1131 audit(1707810747.827:165): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:52:27.912090 kernel: audit: type=1130 audit(1707810747.838:166): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 07:52:27.838000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:52:27.934525 kernel: audit: type=1106 audit(1707810747.838:167): pid=1585 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_unix,pam_permit,pam_systemd acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Feb 13 07:52:27.838000 audit[1585]: USER_END pid=1585 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_unix,pam_permit,pam_systemd acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Feb 13 07:52:27.958204 sudo[1615]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Feb 13 07:52:27.958311 sudo[1615]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=500) Feb 13 07:52:27.960486 kernel: audit: type=1104 audit(1707810747.838:168): pid=1585 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Feb 13 07:52:27.838000 audit[1585]: CRED_DISP pid=1585 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Feb 13 07:52:27.984022 kernel: audit: type=1106 audit(1707810747.840:169): pid=1582 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 07:52:27.840000 audit[1582]: USER_END pid=1582 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 07:52:28.015976 kernel: audit: type=1104 audit(1707810747.840:170): pid=1582 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 07:52:27.840000 audit[1582]: CRED_DISP pid=1582 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 07:52:28.041495 kernel: audit: type=1131 audit(1707810747.841:171): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@5-145.40.90.207:22-139.178.68.195:51356 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:52:27.841000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@5-145.40.90.207:22-139.178.68.195:51356 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 07:52:27.842000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@6-145.40.90.207:22-139.178.68.195:51368 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:52:27.907000 audit[1612]: USER_ACCT pid=1612 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 07:52:27.908000 audit[1612]: CRED_ACQ pid=1612 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 07:52:27.908000 audit[1612]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc86be0e30 a2=3 a3=0 items=0 ppid=1 pid=1612 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=9 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:52:27.908000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 07:52:27.913000 audit[1612]: USER_START pid=1612 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 07:52:27.913000 audit[1614]: CRED_ACQ pid=1614 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 07:52:27.957000 audit[1615]: USER_ACCT pid=1615 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Feb 13 07:52:27.957000 audit[1615]: CRED_REFR pid=1615 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Feb 13 07:52:27.958000 audit[1615]: USER_START pid=1615 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_limits,pam_env,pam_unix,pam_permit,pam_systemd acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Feb 13 07:52:29.988484 systemd[1]: Starting systemd-networkd-wait-online.service... Feb 13 07:52:29.992519 systemd[1]: Finished systemd-networkd-wait-online.service. Feb 13 07:52:29.991000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-networkd-wait-online comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:52:29.992788 systemd[1]: Reached target network-online.target. Feb 13 07:52:29.993501 systemd[1]: Starting docker.service... 
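The audit records in this stretch carry the triggering command line as a hex-encoded PROCTITLE field, with argv entries separated by NUL bytes. Decoding is mechanical; a short sketch (the helper name below is ours, not part of any tool):

```python
# Decode audit PROCTITLE fields: the kernel logs the raw process title as
# hex, with NUL bytes between argv entries, so replacing NULs with spaces
# recovers the command line that produced each record.
def decode_proctitle(hex_title: str) -> str:
    return bytes.fromhex(hex_title).replace(b"\x00", b" ").decode()

print(decode_proctitle("2F7362696E2F617564697463746C002D44"))  # /sbin/auditctl -D
print(decode_proctitle("737368643A20636F7265205B707269765D"))  # sshd: core [priv]
```

The first value, taken from the CONFIG_CHANGE/SYSCALL pair above, confirms that the audit-rules restart begins by flushing all rules with `auditctl -D` before augenrules reloads them (and finds none).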
Feb 13 07:52:30.011723 env[1636]: time="2024-02-13T07:52:30.011664940Z" level=info msg="Starting up" Feb 13 07:52:30.012397 env[1636]: time="2024-02-13T07:52:30.012356810Z" level=info msg="parsed scheme: \"unix\"" module=grpc Feb 13 07:52:30.012397 env[1636]: time="2024-02-13T07:52:30.012367235Z" level=info msg="scheme \"unix\" not registered, fallback to default scheme" module=grpc Feb 13 07:52:30.012397 env[1636]: time="2024-02-13T07:52:30.012380276Z" level=info msg="ccResolverWrapper: sending update to cc: {[{unix:///var/run/docker/libcontainerd/docker-containerd.sock 0 }] }" module=grpc Feb 13 07:52:30.012397 env[1636]: time="2024-02-13T07:52:30.012386732Z" level=info msg="ClientConn switching balancer to \"pick_first\"" module=grpc Feb 13 07:52:30.013320 env[1636]: time="2024-02-13T07:52:30.013280162Z" level=info msg="parsed scheme: \"unix\"" module=grpc Feb 13 07:52:30.013320 env[1636]: time="2024-02-13T07:52:30.013290236Z" level=info msg="scheme \"unix\" not registered, fallback to default scheme" module=grpc Feb 13 07:52:30.013320 env[1636]: time="2024-02-13T07:52:30.013299168Z" level=info msg="ccResolverWrapper: sending update to cc: {[{unix:///var/run/docker/libcontainerd/docker-containerd.sock 0 }] }" module=grpc Feb 13 07:52:30.013320 env[1636]: time="2024-02-13T07:52:30.013304994Z" level=info msg="ClientConn switching balancer to \"pick_first\"" module=grpc Feb 13 07:52:30.159903 env[1636]: time="2024-02-13T07:52:30.159824357Z" level=info msg="Loading containers: start." Feb 13 07:52:30.220000 audit[1679]: NETFILTER_CFG table=nat:2 family=2 entries=2 op=nft_register_chain pid=1679 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 07:52:30.220000 audit[1679]: SYSCALL arch=c000003e syscall=46 success=yes exit=116 a0=3 a1=7ffdbb30d580 a2=0 a3=7ffdbb30d56c items=0 ppid=1636 pid=1679 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:52:30.220000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552 Feb 13 07:52:30.221000 audit[1681]: NETFILTER_CFG table=filter:3 family=2 entries=2 op=nft_register_chain pid=1681 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 07:52:30.221000 audit[1681]: SYSCALL arch=c000003e syscall=46 success=yes exit=124 a0=3 a1=7fff5a99a1f0 a2=0 a3=7fff5a99a1dc items=0 ppid=1636 pid=1681 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:52:30.221000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552 Feb 13 07:52:30.222000 audit[1683]: NETFILTER_CFG table=filter:4 family=2 entries=1 op=nft_register_chain pid=1683 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 07:52:30.222000 audit[1683]: SYSCALL arch=c000003e syscall=46 success=yes exit=112 a0=3 a1=7ffd0df59bc0 a2=0 a3=7ffd0df59bac items=0 ppid=1636 pid=1683 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:52:30.222000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31 Feb 13 07:52:30.223000 audit[1685]: 
NETFILTER_CFG table=filter:5 family=2 entries=1 op=nft_register_chain pid=1685 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 07:52:30.223000 audit[1685]: SYSCALL arch=c000003e syscall=46 success=yes exit=112 a0=3 a1=7ffeef806830 a2=0 a3=7ffeef80681c items=0 ppid=1636 pid=1685 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:52:30.223000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32 Feb 13 07:52:30.225000 audit[1687]: NETFILTER_CFG table=filter:6 family=2 entries=1 op=nft_register_rule pid=1687 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 07:52:30.225000 audit[1687]: SYSCALL arch=c000003e syscall=46 success=yes exit=228 a0=3 a1=7ffd8df5d9e0 a2=0 a3=7ffd8df5d9cc items=0 ppid=1636 pid=1687 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:52:30.225000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4100444F434B45522D49534F4C4154494F4E2D53544147452D31002D6A0052455455524E Feb 13 07:52:30.267000 audit[1692]: NETFILTER_CFG table=filter:7 family=2 entries=1 op=nft_register_rule pid=1692 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 07:52:30.267000 audit[1692]: SYSCALL arch=c000003e syscall=46 success=yes exit=228 a0=3 a1=7fffcc012410 a2=0 a3=7fffcc0123fc items=0 ppid=1636 pid=1692 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:52:30.267000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4100444F434B45522D49534F4C4154494F4E2D53544147452D32002D6A0052455455524E Feb 13 07:52:30.272000 audit[1694]: NETFILTER_CFG table=filter:8 family=2 entries=1 op=nft_register_chain pid=1694 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 07:52:30.272000 audit[1694]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffcbda4f3e0 a2=0 a3=7ffcbda4f3cc items=0 ppid=1636 pid=1694 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:52:30.272000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552 Feb 13 07:52:30.276000 audit[1696]: NETFILTER_CFG table=filter:9 family=2 entries=1 op=nft_register_rule pid=1696 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 07:52:30.276000 audit[1696]: SYSCALL arch=c000003e syscall=46 success=yes exit=212 a0=3 a1=7ffe1dc94520 a2=0 a3=7ffe1dc9450c items=0 ppid=1636 pid=1696 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:52:30.276000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E Feb 13 07:52:30.280000 audit[1698]: NETFILTER_CFG table=filter:10 family=2 entries=2 op=nft_register_chain pid=1698 subj=system_u:system_r:kernel_t:s0 comm="iptables" 
Feb 13 07:52:30.280000 audit[1698]: SYSCALL arch=c000003e syscall=46 success=yes exit=308 a0=3 a1=7ffe3ec2da60 a2=0 a3=7ffe3ec2da4c items=0 ppid=1636 pid=1698 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:52:30.280000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Feb 13 07:52:30.293000 audit[1702]: NETFILTER_CFG table=filter:11 family=2 entries=1 op=nft_unregister_rule pid=1702 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 07:52:30.293000 audit[1702]: SYSCALL arch=c000003e syscall=46 success=yes exit=216 a0=3 a1=7ffd2e472230 a2=0 a3=7ffd2e47221c items=0 ppid=1636 pid=1702 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:52:30.293000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4400464F5257415244002D6A00444F434B45522D55534552 Feb 13 07:52:30.296000 audit[1703]: NETFILTER_CFG table=filter:12 family=2 entries=1 op=nft_register_rule pid=1703 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 07:52:30.296000 audit[1703]: SYSCALL arch=c000003e syscall=46 success=yes exit=224 a0=3 a1=7ffc80cdbdd0 a2=0 a3=7ffc80cdbdbc items=0 ppid=1636 pid=1703 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:52:30.296000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Feb 13 07:52:30.320670 kernel: Initializing XFRM netlink socket Feb 13 07:52:30.383178 env[1636]: time="2024-02-13T07:52:30.383138653Z" level=info msg="Default bridge (docker0) is assigned with an IP address 172.17.0.0/16. 
Daemon option --bip can be used to set a preferred IP address" Feb 13 07:52:30.392000 audit[1711]: NETFILTER_CFG table=nat:13 family=2 entries=2 op=nft_register_chain pid=1711 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 07:52:30.392000 audit[1711]: SYSCALL arch=c000003e syscall=46 success=yes exit=492 a0=3 a1=7ffdc15e5da0 a2=0 a3=7ffdc15e5d8c items=0 ppid=1636 pid=1711 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:52:30.392000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D74006E6174002D4900504F5354524F5554494E47002D73003137322E31372E302E302F31360000002D6F00646F636B657230002D6A004D415351554552414445 Feb 13 07:52:30.431000 audit[1714]: NETFILTER_CFG table=nat:14 family=2 entries=1 op=nft_register_rule pid=1714 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 07:52:30.431000 audit[1714]: SYSCALL arch=c000003e syscall=46 success=yes exit=288 a0=3 a1=7ffe53f09140 a2=0 a3=7ffe53f0912c items=0 ppid=1636 pid=1714 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:52:30.431000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D74006E6174002D4900444F434B4552002D6900646F636B657230002D6A0052455455524E Feb 13 07:52:30.438000 audit[1717]: NETFILTER_CFG table=filter:15 family=2 entries=1 op=nft_register_rule pid=1717 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 07:52:30.438000 audit[1717]: SYSCALL arch=c000003e syscall=46 success=yes exit=376 a0=3 a1=7fff2f442fd0 a2=0 a3=7fff2f442fbc items=0 ppid=1636 pid=1717 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:52:30.438000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6900646F636B657230002D6F00646F636B657230002D6A00414343455054 Feb 13 07:52:30.443000 audit[1719]: NETFILTER_CFG table=filter:16 family=2 entries=1 op=nft_register_rule pid=1719 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 07:52:30.443000 audit[1719]: SYSCALL arch=c000003e syscall=46 success=yes exit=376 a0=3 a1=7fff991ee2d0 a2=0 a3=7fff991ee2bc items=0 ppid=1636 pid=1719 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:52:30.443000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6900646F636B6572300000002D6F00646F636B657230002D6A00414343455054 Feb 13 07:52:30.446000 audit[1721]: NETFILTER_CFG table=nat:17 family=2 entries=2 op=nft_register_chain pid=1721 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 07:52:30.446000 audit[1721]: SYSCALL arch=c000003e syscall=46 success=yes exit=356 a0=3 a1=7ffeae3c9020 a2=0 a3=7ffeae3c900c items=0 ppid=1636 pid=1721 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:52:30.446000 audit: PROCTITLE 
proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552 Feb 13 07:52:30.452000 audit[1723]: NETFILTER_CFG table=nat:18 family=2 entries=2 op=nft_register_chain pid=1723 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 07:52:30.452000 audit[1723]: SYSCALL arch=c000003e syscall=46 success=yes exit=444 a0=3 a1=7ffd0e52bfa0 a2=0 a3=7ffd0e52bf8c items=0 ppid=1636 pid=1723 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:52:30.452000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003132372E302E302E302F38 Feb 13 07:52:30.456000 audit[1725]: NETFILTER_CFG table=filter:19 family=2 entries=1 op=nft_register_rule pid=1725 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 07:52:30.456000 audit[1725]: SYSCALL arch=c000003e syscall=46 success=yes exit=304 a0=3 a1=7ffc65b8f870 a2=0 a3=7ffc65b8f85c items=0 ppid=1636 pid=1725 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:52:30.456000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6F00646F636B657230002D6A00444F434B4552 Feb 13 07:52:30.479000 audit[1728]: NETFILTER_CFG table=filter:20 family=2 entries=1 op=nft_register_rule pid=1728 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 07:52:30.479000 audit[1728]: SYSCALL arch=c000003e syscall=46 success=yes exit=508 a0=3 a1=7ffeee4d2e90 a2=0 a3=7ffeee4d2e7c items=0 ppid=1636 pid=1728 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:52:30.479000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6F00646F636B657230002D6D00636F6E6E747261636B002D2D637473746174650052454C415445442C45535441424C4953484544002D6A00414343455054 Feb 13 07:52:30.484000 audit[1730]: NETFILTER_CFG table=filter:21 family=2 entries=1 op=nft_register_rule pid=1730 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 07:52:30.484000 audit[1730]: SYSCALL arch=c000003e syscall=46 success=yes exit=240 a0=3 a1=7fff0df42920 a2=0 a3=7fff0df4290c items=0 ppid=1636 pid=1730 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:52:30.484000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31 Feb 13 07:52:30.488000 audit[1732]: NETFILTER_CFG table=filter:22 family=2 entries=1 op=nft_register_rule pid=1732 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 07:52:30.488000 audit[1732]: SYSCALL arch=c000003e syscall=46 success=yes exit=428 a0=3 a1=7ffe096bd1c0 a2=0 a3=7ffe096bd1ac items=0 ppid=1636 pid=1732 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" 
exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:52:30.488000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D740066696C746572002D4900444F434B45522D49534F4C4154494F4E2D53544147452D31002D6900646F636B6572300000002D6F00646F636B657230002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D32 Feb 13 07:52:30.493000 audit[1734]: NETFILTER_CFG table=filter:23 family=2 entries=1 op=nft_register_rule pid=1734 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 07:52:30.493000 audit[1734]: SYSCALL arch=c000003e syscall=46 success=yes exit=312 a0=3 a1=7ffe2ebed020 a2=0 a3=7ffe2ebed00c items=0 ppid=1636 pid=1734 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:52:30.493000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D740066696C746572002D4900444F434B45522D49534F4C4154494F4E2D53544147452D32002D6F00646F636B657230002D6A0044524F50 Feb 13 07:52:30.495343 systemd-networkd[1338]: docker0: Link UP Feb 13 07:52:30.509000 audit[1738]: NETFILTER_CFG table=filter:24 family=2 entries=1 op=nft_unregister_rule pid=1738 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 07:52:30.509000 audit[1738]: SYSCALL arch=c000003e syscall=46 success=yes exit=228 a0=3 a1=7ffd06a1df60 a2=0 a3=7ffd06a1df4c items=0 ppid=1636 pid=1738 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:52:30.509000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4400464F5257415244002D6A00444F434B45522D55534552 Feb 13 07:52:30.511000 audit[1739]: NETFILTER_CFG table=filter:25 family=2 entries=1 op=nft_register_rule pid=1739 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 07:52:30.511000 audit[1739]: SYSCALL arch=c000003e syscall=46 success=yes exit=224 a0=3 a1=7ffcb7e11bc0 a2=0 a3=7ffcb7e11bac items=0 ppid=1636 pid=1739 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:52:30.511000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Feb 13 07:52:30.513111 env[1636]: time="2024-02-13T07:52:30.513015641Z" level=info msg="Loading containers: done." Feb 13 07:52:30.533217 env[1636]: time="2024-02-13T07:52:30.533151447Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Feb 13 07:52:30.533526 env[1636]: time="2024-02-13T07:52:30.533494062Z" level=info msg="Docker daemon" commit=112bdf3343 graphdriver(s)=overlay2 version=20.10.23 Feb 13 07:52:30.533786 env[1636]: time="2024-02-13T07:52:30.533712859Z" level=info msg="Daemon has completed initialization" Feb 13 07:52:30.534265 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck3990701994-merged.mount: Deactivated successfully. Feb 13 07:52:30.556736 systemd[1]: Started docker.service. 
Feb 13 07:52:30.556000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=docker comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:52:30.566886 env[1636]: time="2024-02-13T07:52:30.566771392Z" level=info msg="API listen on /run/docker.sock" Feb 13 07:52:30.611045 systemd[1]: Reloading. Feb 13 07:52:30.670647 /usr/lib/systemd/system-generators/torcx-generator[1791]: time="2024-02-13T07:52:30Z" level=debug msg="common configuration parsed" base_dir=/var/lib/torcx/ conf_dir=/etc/torcx/ run_dir=/run/torcx/ store_paths="[/usr/share/torcx/store /usr/share/oem/torcx/store/3510.3.2 /usr/share/oem/torcx/store /var/lib/torcx/store/3510.3.2 /var/lib/torcx/store]" Feb 13 07:52:30.670688 /usr/lib/systemd/system-generators/torcx-generator[1791]: time="2024-02-13T07:52:30Z" level=info msg="torcx already run" Feb 13 07:52:30.781781 systemd[1]: /usr/lib/systemd/system/locksmithd.service:8: Unit uses CPUShares=; please use CPUWeight= instead. Support for CPUShares= will be removed soon. Feb 13 07:52:30.781792 systemd[1]: /usr/lib/systemd/system/locksmithd.service:9: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 13 07:52:30.796015 systemd[1]: /run/systemd/system/docker.socket:8: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Feb 13 07:52:30.841000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.841000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.841000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.841000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.841000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.841000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.841000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.841000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.841000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.841000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 
tclass=capability2 permissive=0 Feb 13 07:52:30.841000 audit: BPF prog-id=34 op=LOAD Feb 13 07:52:30.841000 audit: BPF prog-id=24 op=UNLOAD Feb 13 07:52:30.841000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.841000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.841000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.841000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.841000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.841000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.841000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.841000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.841000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.841000 audit: BPF prog-id=35 op=LOAD Feb 13 07:52:30.841000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.841000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.841000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.841000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.841000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.841000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.841000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.841000 
audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.842000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.842000 audit: BPF prog-id=36 op=LOAD Feb 13 07:52:30.842000 audit: BPF prog-id=25 op=UNLOAD Feb 13 07:52:30.842000 audit: BPF prog-id=26 op=UNLOAD Feb 13 07:52:30.842000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.842000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.842000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.842000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.842000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.842000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.842000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.842000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.842000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.842000 audit: BPF prog-id=37 op=LOAD Feb 13 07:52:30.842000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.842000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.842000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.842000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.842000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.842000 audit[1]: 
AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.842000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.842000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.842000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.842000 audit: BPF prog-id=38 op=LOAD Feb 13 07:52:30.842000 audit: BPF prog-id=21 op=UNLOAD Feb 13 07:52:30.842000 audit: BPF prog-id=22 op=UNLOAD Feb 13 07:52:30.842000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.842000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.842000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.842000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.842000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.842000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.842000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.842000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.842000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.842000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.842000 audit: BPF prog-id=39 op=LOAD Feb 13 07:52:30.842000 audit: BPF prog-id=23 op=UNLOAD Feb 13 07:52:30.843000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.843000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 
permissive=0 Feb 13 07:52:30.843000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.843000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.843000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.843000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.843000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.843000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.843000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.843000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.843000 audit: BPF prog-id=40 op=LOAD Feb 13 07:52:30.843000 audit: BPF prog-id=27 op=UNLOAD Feb 13 07:52:30.843000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.843000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.843000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.843000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.843000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.843000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.843000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.843000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.843000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 
scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.843000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.843000 audit: BPF prog-id=41 op=LOAD Feb 13 07:52:30.843000 audit: BPF prog-id=18 op=UNLOAD Feb 13 07:52:30.843000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.843000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.843000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.843000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.843000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.843000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.843000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.843000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.843000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.843000 audit: BPF prog-id=42 op=LOAD Feb 13 07:52:30.843000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.843000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.843000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.843000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.843000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.843000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 
tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.843000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.843000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.844000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.844000 audit: BPF prog-id=43 op=LOAD Feb 13 07:52:30.844000 audit: BPF prog-id=19 op=UNLOAD Feb 13 07:52:30.844000 audit: BPF prog-id=20 op=UNLOAD Feb 13 07:52:30.844000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.844000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.844000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.844000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.844000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.844000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.844000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.844000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.844000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.844000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.844000 audit: BPF prog-id=44 op=LOAD Feb 13 07:52:30.844000 audit: BPF prog-id=29 op=UNLOAD Feb 13 07:52:30.844000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.844000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.844000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" 
capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.844000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.844000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.844000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.844000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.844000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.844000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.844000 audit: BPF prog-id=45 op=LOAD Feb 13 07:52:30.844000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.844000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.844000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.844000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.844000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.844000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.844000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.844000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.844000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.845000 audit: BPF prog-id=46 op=LOAD Feb 13 07:52:30.845000 audit: BPF prog-id=30 op=UNLOAD Feb 13 07:52:30.845000 audit: BPF prog-id=31 op=UNLOAD Feb 13 07:52:30.845000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" 
capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.845000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.845000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.845000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.845000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.845000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.845000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.845000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.845000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.845000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.845000 audit: BPF prog-id=47 op=LOAD Feb 13 07:52:30.845000 audit: BPF prog-id=32 op=UNLOAD Feb 13 07:52:30.846000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.846000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.846000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.846000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.846000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.846000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:30.846000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 
permissive=0
Feb 13 07:52:30.846000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:30.846000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:30.846000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:30.846000 audit: BPF prog-id=48 op=LOAD
Feb 13 07:52:30.846000 audit: BPF prog-id=28 op=UNLOAD
Feb 13 07:52:30.851107 systemd[1]: Started kubelet.service.
Feb 13 07:52:30.850000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:30.874274 kubelet[1847]: E0213 07:52:30.874220 1847 run.go:74] "command failed" err="failed to load kubelet config file, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory, path: /var/lib/kubelet/config.yaml"
Feb 13 07:52:30.875398 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Feb 13 07:52:30.875465 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Feb 13 07:52:30.874000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed'
Feb 13 07:52:31.497975 env[1458]: time="2024-02-13T07:52:31.497853200Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.27.10\""
Feb 13 07:52:32.085136 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2899797885.mount: Deactivated successfully.
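The start/crash/stop sequence above is the normal bootstrap loop on a node that has not been configured yet: systemd starts kubelet.service, the kubelet exits with status 1 because /var/lib/kubelet/config.yaml does not exist (that file is normally written by `kubeadm init`/`kubeadm join` or the node provisioner), and the audit stream records SERVICE_STOP with res=failed. The sketch below is illustrative only, not kubelet source; it mirrors the failing precondition, with the path taken verbatim from the error record:

    import os
    import sys

    # Path the kubelet was asked to load (from the error record above).
    KUBELET_CONFIG = "/var/lib/kubelet/config.yaml"

    def check_kubelet_config(path: str = KUBELET_CONFIG) -> None:
        # The config file is read before anything else starts; a missing
        # file aborts startup with a non-zero exit status, which systemd
        # reports as status=1/FAILURE and then retries per its Restart=
        # policy (the retry is visible at 07:52:41 below).
        if not os.path.isfile(path):
            sys.exit(f"failed to load Kubelet config file {path}: "
                     "no such file or directory")

    if __name__ == "__main__":
        check_kubelet_config()

Once provisioning has written the file, apparently around the "Reloading." at 07:52:43, the same unit starts cleanly: the kubelet[2172] instance further down gets past config loading.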
Feb 13 07:52:33.301993 env[1458]: time="2024-02-13T07:52:33.301928481Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/kube-apiserver:v1.27.10,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 07:52:33.302509 env[1458]: time="2024-02-13T07:52:33.302464640Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:7968fc5c824ed95404f421a90882835f250220c0fd799b4fceef340dd5585ed5,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 07:52:33.303568 env[1458]: time="2024-02-13T07:52:33.303527117Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/kube-apiserver:v1.27.10,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 07:52:33.304562 env[1458]: time="2024-02-13T07:52:33.304514372Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/kube-apiserver@sha256:cfcebda74d6e665b68931d3589ee69fde81cd503ff3169888e4502af65579d98,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 07:52:33.305460 env[1458]: time="2024-02-13T07:52:33.305421747Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.27.10\" returns image reference \"sha256:7968fc5c824ed95404f421a90882835f250220c0fd799b4fceef340dd5585ed5\"" Feb 13 07:52:33.310731 env[1458]: time="2024-02-13T07:52:33.310701986Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.27.10\"" Feb 13 07:52:34.832707 env[1458]: time="2024-02-13T07:52:34.832675335Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/kube-controller-manager:v1.27.10,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 07:52:34.833268 env[1458]: time="2024-02-13T07:52:34.833258441Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:c8134be729ba23c6e0c3e5dd52c393fc8d3cfc688bcec33540f64bb0137b67e0,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 07:52:34.834228 env[1458]: time="2024-02-13T07:52:34.834183960Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/kube-controller-manager:v1.27.10,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 07:52:34.835409 env[1458]: time="2024-02-13T07:52:34.835375585Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/kube-controller-manager@sha256:fa168ebca1f6dbfe86ef0a690e007531c1f53569274fc7dc2774fe228b6ce8c2,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 07:52:34.835811 env[1458]: time="2024-02-13T07:52:34.835768761Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.27.10\" returns image reference \"sha256:c8134be729ba23c6e0c3e5dd52c393fc8d3cfc688bcec33540f64bb0137b67e0\"" Feb 13 07:52:34.845410 env[1458]: time="2024-02-13T07:52:34.845335839Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.27.10\"" Feb 13 07:52:35.921034 env[1458]: time="2024-02-13T07:52:35.920985008Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/kube-scheduler:v1.27.10,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 07:52:35.921678 env[1458]: time="2024-02-13T07:52:35.921606901Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:5eed9876e7181341b7015e3486dfd234f8e0d0d7d3d19b1bb971d720cd320975,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 07:52:35.922780 env[1458]: 
time="2024-02-13T07:52:35.922731808Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/kube-scheduler:v1.27.10,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 07:52:35.923761 env[1458]: time="2024-02-13T07:52:35.923722530Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/kube-scheduler@sha256:09294de61e63987f181077cbc2f5c82463878af9cd8ecc6110c54150c9ae3143,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 07:52:35.924201 env[1458]: time="2024-02-13T07:52:35.924143910Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.27.10\" returns image reference \"sha256:5eed9876e7181341b7015e3486dfd234f8e0d0d7d3d19b1bb971d720cd320975\"" Feb 13 07:52:35.931226 env[1458]: time="2024-02-13T07:52:35.931206819Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.27.10\"" Feb 13 07:52:36.784408 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2307186035.mount: Deactivated successfully. Feb 13 07:52:37.115483 env[1458]: time="2024-02-13T07:52:37.115392221Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/kube-proxy:v1.27.10,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 07:52:37.116171 env[1458]: time="2024-02-13T07:52:37.116137377Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:db7b01e105753475c198490cf875df1314fd1a599f67ea1b184586cb399e1cae,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 07:52:37.116765 env[1458]: time="2024-02-13T07:52:37.116727513Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/kube-proxy:v1.27.10,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 07:52:37.117351 env[1458]: time="2024-02-13T07:52:37.117317556Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/kube-proxy@sha256:d084b53c772f62ec38fddb2348a82d4234016daf6cd43fedbf0b3281f3790f88,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 07:52:37.117663 env[1458]: time="2024-02-13T07:52:37.117620980Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.27.10\" returns image reference \"sha256:db7b01e105753475c198490cf875df1314fd1a599f67ea1b184586cb399e1cae\"" Feb 13 07:52:37.123025 env[1458]: time="2024-02-13T07:52:37.122987262Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\"" Feb 13 07:52:37.639891 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2675483445.mount: Deactivated successfully. 
Feb 13 07:52:37.641144 env[1458]: time="2024-02-13T07:52:37.641096752Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/pause:3.9,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 13 07:52:37.641674 env[1458]: time="2024-02-13T07:52:37.641650200Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 13 07:52:37.642264 env[1458]: time="2024-02-13T07:52:37.642225800Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/pause:3.9,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 13 07:52:37.643280 env[1458]: time="2024-02-13T07:52:37.643232772Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 13 07:52:37.643495 env[1458]: time="2024-02-13T07:52:37.643452755Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\" returns image reference \"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\""
Feb 13 07:52:37.648754 env[1458]: time="2024-02-13T07:52:37.648707630Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.7-0\""
Feb 13 07:52:38.331165 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3957880694.mount: Deactivated successfully.
Feb 13 07:52:41.019814 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
Feb 13 07:52:41.018000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:41.019953 systemd[1]: Stopped kubelet.service.
Feb 13 07:52:41.021493 systemd[1]: Started kubelet.service.
Feb 13 07:52:41.025613 kernel: kauditd_printk_skb: 259 callbacks suppressed
Feb 13 07:52:41.025671 kernel: audit: type=1130 audit(1707810761.018:381): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:41.018000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:41.047900 kubelet[1938]: E0213 07:52:41.047768 1938 run.go:74] "command failed" err="failed to load kubelet config file, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory, path: /var/lib/kubelet/config.yaml"
Feb 13 07:52:41.050149 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Feb 13 07:52:41.050294 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Feb 13 07:52:41.102172 kernel: audit: type=1131 audit(1707810761.018:382): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:41.102208 kernel: audit: type=1130 audit(1707810761.020:383): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:41.020000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:41.157351 kernel: audit: type=1131 audit(1707810761.049:384): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed'
Feb 13 07:52:41.049000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed'
Feb 13 07:52:41.194208 env[1458]: time="2024-02-13T07:52:41.194127389Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/etcd:3.5.7-0,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 13 07:52:41.194906 env[1458]: time="2024-02-13T07:52:41.194868470Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:86b6af7dd652c1b38118be1c338e9354b33469e69a218f7e290a0ca5304ad681,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 13 07:52:41.195768 env[1458]: time="2024-02-13T07:52:41.195728494Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/etcd:3.5.7-0,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 13 07:52:41.196533 env[1458]: time="2024-02-13T07:52:41.196492202Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/etcd@sha256:51eae8381dcb1078289fa7b4f3df2630cdc18d09fb56f8e56b41c40e191d6c83,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 13 07:52:41.196900 env[1458]: time="2024-02-13T07:52:41.196860629Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.7-0\" returns image reference \"sha256:86b6af7dd652c1b38118be1c338e9354b33469e69a218f7e290a0ca5304ad681\""
Feb 13 07:52:41.202385 env[1458]: time="2024-02-13T07:52:41.202370007Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.10.1\""
Feb 13 07:52:41.683729 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2283915025.mount: Deactivated successfully.
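Two kinds of audit output interleave here: the records emitted directly (audit[1]: SERVICE_START ...) and the kernel's console echoes of the same events (kernel: audit: type=1130 ...). The kauditd_printk_skb: 259 callbacks suppressed record only means the console echoes were rate-limited; the underlying audit records are not lost. The numeric types map to the symbolic names used elsewhere in this log; the values below are from the kernel audit UAPI (include/uapi/linux/audit.h), and the helper itself is just an illustrative sketch:

    import collections
    import re

    # Audit record types appearing in this boot.
    AUDIT_TYPES = {
        1130: "SERVICE_START",  # systemd started a unit
        1131: "SERVICE_STOP",   # systemd stopped a unit
        1300: "SYSCALL",
        1327: "PROCTITLE",
        1400: "AVC",            # SELinux access decision
        1401: "SELINUX_ERR",
    }

    def count_audit_records(journal_text: str) -> collections.Counter:
        """Tally `audit: type=NNNN ...` console echoes by symbolic name."""
        return collections.Counter(
            AUDIT_TYPES.get(int(t), int(t))
            for t in re.findall(r"audit: type=(\d+)", journal_text)
        )

So the type=1130/1131 pairs above are just the console view of the SERVICE_START/SERVICE_STOP records around them, down to the matching audit(...:381)-style sequence numbers.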
Feb 13 07:52:42.150147 env[1458]: time="2024-02-13T07:52:42.150099963Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/coredns/coredns:v1.10.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 13 07:52:42.150788 env[1458]: time="2024-02-13T07:52:42.150743791Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:ead0a4a53df89fd173874b46093b6e62d8c72967bbf606d672c9e8c9b601a4fc,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 13 07:52:42.151422 env[1458]: time="2024-02-13T07:52:42.151388718Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/coredns/coredns:v1.10.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 13 07:52:42.152178 env[1458]: time="2024-02-13T07:52:42.152142856Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/coredns/coredns@sha256:a0ead06651cf580044aeb0a0feba63591858fb2e43ade8c9dea45a6a89ae7e5e,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 13 07:52:42.152524 env[1458]: time="2024-02-13T07:52:42.152473375Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.10.1\" returns image reference \"sha256:ead0a4a53df89fd173874b46093b6e62d8c72967bbf606d672c9e8c9b601a4fc\""
Feb 13 07:52:43.778840 systemd[1]: Stopped kubelet.service.
Feb 13 07:52:43.777000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:43.787193 systemd[1]: Reloading.
Feb 13 07:52:43.816225 /usr/lib/systemd/system-generators/torcx-generator[2112]: time="2024-02-13T07:52:43Z" level=debug msg="common configuration parsed" base_dir=/var/lib/torcx/ conf_dir=/etc/torcx/ run_dir=/run/torcx/ store_paths="[/usr/share/torcx/store /usr/share/oem/torcx/store/3510.3.2 /usr/share/oem/torcx/store /var/lib/torcx/store/3510.3.2 /var/lib/torcx/store]"
Feb 13 07:52:43.816249 /usr/lib/systemd/system-generators/torcx-generator[2112]: time="2024-02-13T07:52:43Z" level=info msg="torcx already run"
Feb 13 07:52:43.777000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:43.838690 kernel: audit: type=1130 audit(1707810763.777:385): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:43.838749 kernel: audit: type=1131 audit(1707810763.777:386): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:43.916647 systemd[1]: /usr/lib/systemd/system/locksmithd.service:8: Unit uses CPUShares=; please use CPUWeight= instead. Support for CPUShares= will be removed soon.
Feb 13 07:52:43.916656 systemd[1]: /usr/lib/systemd/system/locksmithd.service:9: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 13 07:52:43.927491 systemd[1]: /run/systemd/system/docker.socket:8: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
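The three configuration warnings above are actionable: locksmithd.service lines 8-9 should move from CPUShares=/MemoryLimit= to CPUWeight=/MemoryMax=, and docker.socket should reference /run/docker.sock directly. A rough migration helper, purely as an illustration (not a Flatcar or systemd tool; the CPUShares-to-CPUWeight rescale is an approximate linear mapping between the two documented ranges, so review the result before installing it):

    def modernize_unit(text: str) -> str:
        """Rewrite deprecated directives the way the warnings above suggest."""
        out = []
        for line in text.splitlines():
            if line.startswith("CPUShares="):
                shares = int(line.split("=", 1)[1])
                # CPUShares= spans 2..262144 with default 1024; CPUWeight=
                # spans 1..10000 with default 100, hence *100/1024.
                weight = max(1, min(10000, shares * 100 // 1024))
                out.append(f"CPUWeight={weight}")
            elif line.startswith("MemoryLimit="):
                out.append("MemoryMax=" + line.split("=", 1)[1])
            elif line.startswith("ListenStream=/var/run/"):
                out.append("ListenStream=/run/" +
                            line[len("ListenStream=/var/run/"):])
            else:
                out.append(line)
        return "\n".join(out)

For example, modernize_unit("CPUShares=2048") returns "CPUWeight=200", and the docker.socket case becomes ListenStream=/run/docker.sock, which matches the substitution systemd says it applied at runtime.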
Feb 13 07:52:43.972000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:43.972000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.095011 kernel: audit: type=1400 audit(1707810763.972:387): avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.095076 kernel: audit: type=1400 audit(1707810763.972:388): avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.095091 kernel: audit: type=1400 audit(1707810763.972:389): avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:43.972000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.157937 kernel: audit: type=1400 audit(1707810763.972:390): avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:43.972000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:43.972000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:43.972000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:43.972000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:43.972000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:43.972000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.094000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.094000 audit: BPF prog-id=49 op=LOAD Feb 13 07:52:44.094000 audit: BPF prog-id=34 op=UNLOAD Feb 13 07:52:44.094000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.094000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 
tclass=capability2 permissive=0 Feb 13 07:52:44.094000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.094000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.094000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.094000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.094000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.094000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.219000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.219000 audit: BPF prog-id=50 op=LOAD Feb 13 07:52:44.219000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.219000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.219000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.219000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.219000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.219000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.219000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.219000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.219000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.219000 audit: BPF prog-id=51 op=LOAD Feb 13 07:52:44.219000 audit: BPF prog-id=35 op=UNLOAD Feb 13 07:52:44.219000 
audit: BPF prog-id=36 op=UNLOAD Feb 13 07:52:44.219000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.219000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.219000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.219000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.219000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.219000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.219000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.219000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.219000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.219000 audit: BPF prog-id=52 op=LOAD Feb 13 07:52:44.219000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.219000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.219000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.219000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.219000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.219000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.219000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.219000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 
scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.219000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.219000 audit: BPF prog-id=53 op=LOAD Feb 13 07:52:44.219000 audit: BPF prog-id=37 op=UNLOAD Feb 13 07:52:44.219000 audit: BPF prog-id=38 op=UNLOAD Feb 13 07:52:44.220000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.220000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.220000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.220000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.220000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.220000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.220000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.220000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.220000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.220000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.220000 audit: BPF prog-id=54 op=LOAD Feb 13 07:52:44.220000 audit: BPF prog-id=39 op=UNLOAD Feb 13 07:52:44.220000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.220000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.220000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.220000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.220000 audit[1]: AVC avc: denied { perfmon 
} for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.220000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.220000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.220000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.220000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.220000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.220000 audit: BPF prog-id=55 op=LOAD Feb 13 07:52:44.220000 audit: BPF prog-id=40 op=UNLOAD Feb 13 07:52:44.221000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.221000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.221000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.221000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.221000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.221000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.221000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.221000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.221000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.221000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.221000 audit: BPF prog-id=56 op=LOAD Feb 13 07:52:44.221000 audit: BPF prog-id=41 op=UNLOAD Feb 13 07:52:44.221000 audit[1]: AVC avc: denied { bpf } for 
pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.221000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.221000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.221000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.221000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.221000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.221000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.221000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.221000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.221000 audit: BPF prog-id=57 op=LOAD Feb 13 07:52:44.221000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.221000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.221000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.221000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.221000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.221000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.221000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.221000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 
07:52:44.221000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.221000 audit: BPF prog-id=58 op=LOAD Feb 13 07:52:44.221000 audit: BPF prog-id=42 op=UNLOAD Feb 13 07:52:44.221000 audit: BPF prog-id=43 op=UNLOAD Feb 13 07:52:44.222000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.222000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.222000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.222000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.222000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.222000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.222000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.222000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.222000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.222000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.222000 audit: BPF prog-id=59 op=LOAD Feb 13 07:52:44.222000 audit: BPF prog-id=44 op=UNLOAD Feb 13 07:52:44.222000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.222000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.222000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.222000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.222000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 
tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.222000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.222000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.222000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.222000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.222000 audit: BPF prog-id=60 op=LOAD Feb 13 07:52:44.222000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.222000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.222000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.222000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.222000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.222000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.222000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.222000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.222000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.222000 audit: BPF prog-id=61 op=LOAD Feb 13 07:52:44.222000 audit: BPF prog-id=45 op=UNLOAD Feb 13 07:52:44.222000 audit: BPF prog-id=46 op=UNLOAD Feb 13 07:52:44.222000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.222000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.222000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 
tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.222000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.222000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.222000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.222000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.222000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.222000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.222000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.222000 audit: BPF prog-id=62 op=LOAD Feb 13 07:52:44.222000 audit: BPF prog-id=47 op=UNLOAD Feb 13 07:52:44.224000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.224000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.224000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.224000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.224000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.224000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.224000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.224000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.224000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:44.224000 audit[1]: AVC 
avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:44.224000 audit: BPF prog-id=63 op=LOAD
Feb 13 07:52:44.224000 audit: BPF prog-id=48 op=UNLOAD
Feb 13 07:52:44.232570 systemd[1]: Started kubelet.service.
Feb 13 07:52:44.231000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 07:52:44.255535 kubelet[2172]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Feb 13 07:52:44.255535 kubelet[2172]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Feb 13 07:52:44.255535 kubelet[2172]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Feb 13 07:52:44.255794 kubelet[2172]: I0213 07:52:44.255611 2172 server.go:199] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Feb 13 07:52:44.462429 kubelet[2172]: I0213 07:52:44.462376 2172 server.go:415] "Kubelet version" kubeletVersion="v1.27.2"
Feb 13 07:52:44.462429 kubelet[2172]: I0213 07:52:44.462402 2172 server.go:417] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Feb 13 07:52:44.462504 kubelet[2172]: I0213 07:52:44.462498 2172 server.go:837] "Client rotation is on, will bootstrap in background"
Feb 13 07:52:44.465410 kubelet[2172]: I0213 07:52:44.465360 2172 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Feb 13 07:52:44.465946 kubelet[2172]: E0213 07:52:44.465939 2172 certificate_manager.go:562] kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post "https://145.40.90.207:6443/apis/certificates.k8s.io/v1/certificatesigningrequests": dial tcp 145.40.90.207:6443: connect: connection refused
Feb 13 07:52:44.483187 kubelet[2172]: I0213 07:52:44.483154 2172 server.go:662] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Feb 13 07:52:44.483305 kubelet[2172]: I0213 07:52:44.483275 2172 container_manager_linux.go:266] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Feb 13 07:52:44.483336 kubelet[2172]: I0213 07:52:44.483325 2172 container_manager_linux.go:271] "Creating Container Manager object based on Node Config" nodeConfig={RuntimeCgroupsName: SystemCgroupsName: KubeletCgroupsName: KubeletOOMScoreAdj:-999 ContainerRuntime: CgroupsPerQOS:true CgroupRoot:/ CgroupDriver:systemd KubeletRootDir:/var/lib/kubelet ProtectKernelDefaults:false NodeAllocatableConfig:{KubeReservedCgroupName: SystemReservedCgroupName: ReservedSystemCPUs: EnforceNodeAllocatable:map[pods:{}] KubeReserved:map[] SystemReserved:map[] HardEvictionThresholds:[{Signal:memory.available Operator:LessThan Value:{Quantity:100Mi Percentage:0} GracePeriod:0s MinReclaim:<nil>} {Signal:nodefs.available Operator:LessThan Value:{Quantity:<nil> Percentage:0.1} GracePeriod:0s MinReclaim:<nil>} {Signal:nodefs.inodesFree Operator:LessThan Value:{Quantity:<nil> Percentage:0.05} GracePeriod:0s MinReclaim:<nil>} {Signal:imagefs.available Operator:LessThan Value:{Quantity:<nil> Percentage:0.15} GracePeriod:0s MinReclaim:<nil>}]} QOSReserved:map[] CPUManagerPolicy:none CPUManagerPolicyOptions:map[] TopologyManagerScope:container CPUManagerReconcilePeriod:10s ExperimentalMemoryManagerPolicy:None ExperimentalMemoryManagerReservedMemory:[] PodPidsLimit:-1 EnforceCPULimits:true CPUCFSQuotaPeriod:100ms TopologyManagerPolicy:none ExperimentalTopologyManagerPolicyOptions:map[]}
Feb 13 07:52:44.483336 kubelet[2172]: I0213 07:52:44.483334 2172 topology_manager.go:136] "Creating topology manager with policy per scope" topologyPolicyName="none" topologyScopeName="container"
Feb 13 07:52:44.483416 kubelet[2172]: I0213 07:52:44.483340 2172 container_manager_linux.go:302] "Creating device plugin manager"
Feb 13 07:52:44.483416 kubelet[2172]: I0213 07:52:44.483385 2172 state_mem.go:36] "Initialized new in-memory state store"
Feb 13 07:52:44.485537 kubelet[2172]: I0213 07:52:44.485485 2172 kubelet.go:405] "Attempting to sync node with API server"
Feb 13 07:52:44.485537 kubelet[2172]: I0213 07:52:44.485509 2172 kubelet.go:298] "Adding static pod path" path="/etc/kubernetes/manifests"
Feb 13 07:52:44.485537 kubelet[2172]: I0213 07:52:44.485536 2172 kubelet.go:309] "Adding apiserver pod source"
Feb 13 07:52:44.485609 kubelet[2172]: I0213 07:52:44.485543 2172 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Feb 13 07:52:44.485890 kubelet[2172]: I0213 07:52:44.485876 2172 kuberuntime_manager.go:257] "Container runtime initialized" containerRuntime="containerd" version="1.6.16" apiVersion="v1"
Feb 13 07:52:44.485965 kubelet[2172]: W0213 07:52:44.485880 2172 reflector.go:533] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.Service: Get "https://145.40.90.207:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 145.40.90.207:6443: connect: connection refused
Feb 13 07:52:44.485965 kubelet[2172]: E0213 07:52:44.485933 2172 reflector.go:148] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://145.40.90.207:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 145.40.90.207:6443: connect: connection refused
Feb 13 07:52:44.486012 kubelet[2172]: W0213 07:52:44.485938 2172 reflector.go:533] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.Node: Get "https://145.40.90.207:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-3510.3.2-a-bf0bde3476&limit=500&resourceVersion=0": dial tcp 145.40.90.207:6443: connect: connection refused
Feb 13 07:52:44.486012 kubelet[2172]: E0213 07:52:44.485999 2172 reflector.go:148] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://145.40.90.207:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-3510.3.2-a-bf0bde3476&limit=500&resourceVersion=0": dial tcp 145.40.90.207:6443: connect: connection refused
Feb 13 07:52:44.486117 kubelet[2172]: W0213 07:52:44.486085 2172 probe.go:268] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Feb 13 07:52:44.486362 kubelet[2172]: I0213 07:52:44.486354 2172 server.go:1168] "Started kubelet"
Feb 13 07:52:44.486425 kubelet[2172]: I0213 07:52:44.486414 2172 ratelimit.go:65] "Setting rate limiting for podresources endpoint" qps=100 burstTokens=10
Feb 13 07:52:44.486425 kubelet[2172]: I0213 07:52:44.486415 2172 server.go:162] "Starting to listen" address="0.0.0.0" port=10250
Feb 13 07:52:44.486566 kubelet[2172]: E0213 07:52:44.486558 2172 cri_stats_provider.go:455] "Failed to get the info of the filesystem with mountpoint" err="unable to find data in memory cache" mountpoint="/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs"
Feb 13 07:52:44.486593 kubelet[2172]: E0213 07:52:44.486573 2172 kubelet.go:1400] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Feb 13 07:52:44.486617 kubelet[2172]: E0213 07:52:44.486542 2172 event.go:289] Unable to write event: '&v1.Event{TypeMeta:v1.TypeMeta{Kind:"", APIVersion:""}, ObjectMeta:v1.ObjectMeta{Name:"ci-3510.3.2-a-bf0bde3476.17b35cd788cbf592", GenerateName:"", Namespace:"default", SelfLink:"", UID:"", ResourceVersion:"", Generation:0, CreationTimestamp:time.Date(1, time.January, 1, 0, 0, 0, 0, time.UTC), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string(nil), Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, InvolvedObject:v1.ObjectReference{Kind:"Node", Namespace:"", Name:"ci-3510.3.2-a-bf0bde3476", UID:"ci-3510.3.2-a-bf0bde3476", APIVersion:"", ResourceVersion:"", FieldPath:""}, Reason:"Starting", Message:"Starting kubelet.", Source:v1.EventSource{Component:"kubelet", Host:"ci-3510.3.2-a-bf0bde3476"}, FirstTimestamp:time.Date(2024, time.February, 13, 7, 52, 44, 486342034, time.Local), LastTimestamp:time.Date(2024, time.February, 13, 7, 52, 44, 486342034, time.Local), Count:1, Type:"Normal", EventTime:time.Date(1, time.January, 1, 0, 0, 0, 0, time.UTC), Series:(*v1.EventSeries)(nil), Action:"", Related:(*v1.ObjectReference)(nil), ReportingController:"", ReportingInstance:""}': 'Post "https://145.40.90.207:6443/api/v1/namespaces/default/events": dial tcp 145.40.90.207:6443: connect: connection refused'(may retry after sleeping)
Feb 13 07:52:44.486000 audit[2172]: AVC avc: denied { mac_admin } for pid=2172 comm="kubelet" capability=33 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:44.486000 audit: SELINUX_ERR op=setxattr invalid_context="system_u:object_r:container_file_t:s0"
Feb 13 07:52:44.486000 audit[2172]: SYSCALL arch=c000003e syscall=188 success=no exit=-22 a0=c000ba0030 a1=c000ba2000 a2=c000ba0000 a3=25 items=0 ppid=1 pid=2172 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kubelet" exe="/opt/bin/kubelet" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 07:52:44.486000 audit: PROCTITLE proctitle=2F6F70742F62696E2F6B7562656C6574002D2D626F6F7473747261702D6B756265636F6E6669673D2F6574632F6B756265726E657465732F626F6F7473747261702D6B7562656C65742E636F6E66002D2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F6B7562656C65742E636F6E66002D2D636F6E6669
Feb 13 07:52:44.486000 audit[2172]: AVC avc: denied { mac_admin } for pid=2172 comm="kubelet" capability=33 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:44.486000 audit: SELINUX_ERR op=setxattr invalid_context="system_u:object_r:container_file_t:s0"
Feb 13 07:52:44.486000 audit[2172]: SYSCALL arch=c000003e syscall=188 success=no exit=-22 a0=c000b9e020 a1=c000ba2018 a2=c000ba00c0 a3=25 items=0 ppid=1 pid=2172 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kubelet" exe="/opt/bin/kubelet" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 07:52:44.486000 audit: PROCTITLE proctitle=2F6F70742F62696E2F6B7562656C6574002D2D626F6F7473747261702D6B756265636F6E6669673D2F6574632F6B756265726E657465732F626F6F7473747261702D6B7562656C65742E636F6E66002D2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F6B7562656C65742E636F6E66002D2D636F6E6669
Feb 13 07:52:44.487720 kubelet[2172]: I0213 07:52:44.487508 2172 kubelet.go:1355] "Unprivileged containerized plugins might not work, could not set selinux context on plugin registration dir" path="/var/lib/kubelet/plugins_registry" err="setxattr /var/lib/kubelet/plugins_registry: invalid argument"
Feb 13 07:52:44.487720 kubelet[2172]: I0213 07:52:44.487530 2172 server.go:461] "Adding debug handlers to kubelet server"
Feb 13 07:52:44.487720 kubelet[2172]: I0213 07:52:44.487540 2172 kubelet.go:1359] "Unprivileged containerized plugins might not work, could not set selinux context on plugins dir" path="/var/lib/kubelet/plugins" err="setxattr /var/lib/kubelet/plugins: invalid argument"
Feb 13 07:52:44.487779 kubelet[2172]: I0213 07:52:44.487747 2172 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Feb 13 07:52:44.487816 kubelet[2172]: I0213 07:52:44.487806 2172 volume_manager.go:284] "Starting Kubelet Volume Manager"
Feb 13 07:52:44.487847 kubelet[2172]: I0213 07:52:44.487839 2172 desired_state_of_world_populator.go:145] "Desired state populator starts to run"
Feb 13 07:52:44.487971 kubelet[2172]: E0213 07:52:44.487963 2172 controller.go:146] "Failed to ensure lease exists, will retry" err="Get \"https://145.40.90.207:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-3510.3.2-a-bf0bde3476?timeout=10s\": dial tcp 145.40.90.207:6443: connect: connection refused" interval="200ms"
Feb 13 07:52:44.488009 kubelet[2172]: W0213 07:52:44.487978 2172 reflector.go:533] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.CSIDriver: Get "https://145.40.90.207:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 145.40.90.207:6443: connect: connection refused
Feb 13 07:52:44.488033 kubelet[2172]: E0213 07:52:44.488009 2172 reflector.go:148] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://145.40.90.207:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 145.40.90.207:6443: connect: connection refused
Feb 13 07:52:44.488000 audit[2196]: NETFILTER_CFG table=mangle:26 family=2 entries=2 op=nft_register_chain pid=2196 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Feb 13 07:52:44.488000 audit[2196]: SYSCALL arch=c000003e syscall=46 success=yes exit=136 a0=3 a1=7ffe00422ca0 a2=0 a3=7ffe00422c8c items=0 ppid=2172 pid=2196 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 07:52:44.488000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65
Feb 13 07:52:44.489000 audit[2197]: NETFILTER_CFG table=filter:27 family=2 entries=1 op=nft_register_chain pid=2197 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Feb 13 07:52:44.489000 audit[2197]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffd2a423e80 a2=0 a3=7ffd2a423e6c items=0 ppid=2172 pid=2197 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 07:52:44.489000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4649524557414C4C002D740066696C746572
Feb 13 07:52:44.490000 audit[2199]: NETFILTER_CFG table=filter:28 family=2 entries=2 op=nft_register_chain pid=2199 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Feb 13 07:52:44.490000 audit[2199]: SYSCALL arch=c000003e syscall=46 success=yes exit=312 a0=3 a1=7ffeaf6d4f90 a2=0 a3=7ffeaf6d4f7c items=0 ppid=2172 pid=2199 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 07:52:44.490000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C
Feb 13 07:52:44.490000 audit[2201]: NETFILTER_CFG table=filter:29 family=2 entries=2 op=nft_register_chain pid=2201 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Feb 13 07:52:44.490000 audit[2201]: SYSCALL arch=c000003e syscall=46 success=yes exit=312 a0=3 a1=7ffeddd05260 a2=0 a3=7ffeddd0524c items=0 ppid=2172 pid=2201 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 07:52:44.490000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C
Feb 13 07:52:44.494000 audit[2204]: NETFILTER_CFG table=filter:30 family=2 entries=1 op=nft_register_rule pid=2204 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Feb 13 07:52:44.494000 audit[2204]: SYSCALL arch=c000003e syscall=46 success=yes exit=924 a0=3 a1=7ffebb30c2a0 a2=0 a3=7ffebb30c28c items=0 ppid=2172 pid=2204 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 07:52:44.494000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D41004B5542452D4649524557414C4C002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E7400626C6F636B20696E636F6D696E67206C6F63616C6E657420636F6E6E656374696F6E73002D2D647374003132372E302E302E302F38
Feb 13 07:52:44.494816 kubelet[2172]: I0213 07:52:44.494792 2172 kubelet_network_linux.go:63] "Initialized iptables rules." protocol=IPv4
Feb 13 07:52:44.494000 audit[2205]: NETFILTER_CFG table=mangle:31 family=10 entries=2 op=nft_register_chain pid=2205 subj=system_u:system_r:kernel_t:s0 comm="ip6tables"
Feb 13 07:52:44.494000 audit[2205]: SYSCALL arch=c000003e syscall=46 success=yes exit=136 a0=3 a1=7ffd33449060 a2=0 a3=7ffd3344904c items=0 ppid=2172 pid=2205 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 07:52:44.494000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65
Feb 13 07:52:44.495296 kubelet[2172]: I0213 07:52:44.495246 2172 kubelet_network_linux.go:63] "Initialized iptables rules." protocol=IPv6
Feb 13 07:52:44.495296 kubelet[2172]: I0213 07:52:44.495258 2172 status_manager.go:207] "Starting to sync pod status with apiserver"
Feb 13 07:52:44.495296 kubelet[2172]: I0213 07:52:44.495268 2172 kubelet.go:2257] "Starting kubelet main sync loop"
Feb 13 07:52:44.494000 audit[2207]: NETFILTER_CFG table=mangle:32 family=2 entries=1 op=nft_register_chain pid=2207 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Feb 13 07:52:44.494000 audit[2207]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7fff1b8f8c10 a2=0 a3=7fff1b8f8bfc items=0 ppid=2172 pid=2207 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 07:52:44.494000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65
Feb 13 07:52:44.495575 kubelet[2172]: E0213 07:52:44.495567 2172 kubelet.go:2281] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Feb 13 07:52:44.495736 kubelet[2172]: W0213 07:52:44.495656 2172 reflector.go:533] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.RuntimeClass: Get "https://145.40.90.207:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 145.40.90.207:6443: connect: connection refused
Feb 13 07:52:44.495736 kubelet[2172]: E0213 07:52:44.495689 2172 reflector.go:148] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://145.40.90.207:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 145.40.90.207:6443: connect: connection refused
Feb 13 07:52:44.495000 audit[2208]: NETFILTER_CFG table=nat:33 family=2 entries=1 op=nft_register_chain pid=2208 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Feb 13 07:52:44.495000 audit[2208]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffc5a1711c0 a2=0 a3=7ffc5a1711ac items=0 ppid=2172 pid=2208 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 07:52:44.495000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174
Feb 13 07:52:44.495000 audit[2209]: NETFILTER_CFG table=mangle:34 family=10 entries=1 op=nft_register_chain pid=2209 subj=system_u:system_r:kernel_t:s0 comm="ip6tables"
Feb 13 07:52:44.495000 audit[2209]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffcba498cb0 a2=0 a3=7ffcba498c9c items=0 ppid=2172 pid=2209 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 07:52:44.495000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65
Feb 13 07:52:44.495000 audit[2210]: NETFILTER_CFG table=filter:35 family=2 entries=1 op=nft_register_chain pid=2210 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Feb 13 07:52:44.495000 audit[2210]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7fffc858fa30 a2=0 a3=7fffc858fa1c items=0 ppid=2172 pid=2210 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 07:52:44.495000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572
Feb 13 07:52:44.495000 audit[2211]: NETFILTER_CFG table=nat:36 family=10 entries=2 op=nft_register_chain pid=2211 subj=system_u:system_r:kernel_t:s0 comm="ip6tables"
Feb 13 07:52:44.495000 audit[2211]: SYSCALL arch=c000003e syscall=46 success=yes exit=128 a0=3 a1=7ffe1a406ea0 a2=0 a3=7ffe1a406e8c items=0 ppid=2172 pid=2211 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 07:52:44.495000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174
Feb 13 07:52:44.496000 audit[2212]: NETFILTER_CFG table=filter:37 family=10 entries=2 op=nft_register_chain pid=2212 subj=system_u:system_r:kernel_t:s0 comm="ip6tables"
Feb 13 07:52:44.496000 audit[2212]: SYSCALL arch=c000003e syscall=46 success=yes exit=136 a0=3 a1=7ffe25a70cd0 a2=0 a3=7ffe25a70cbc items=0 ppid=2172 pid=2212 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 07:52:44.496000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572
Feb 13 07:52:44.500755 kubelet[2172]: I0213 07:52:44.500720 2172 cpu_manager.go:214] "Starting CPU manager" policy="none"
Feb 13 07:52:44.500755 kubelet[2172]: I0213 07:52:44.500727 2172 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s"
Feb 13 07:52:44.500755 kubelet[2172]: I0213 07:52:44.500735 2172 state_mem.go:36] "Initialized new in-memory state store"
Feb 13 07:52:44.501565 kubelet[2172]: I0213 07:52:44.501530 2172 policy_none.go:49] "None policy: Start"
Feb 13 07:52:44.501792 kubelet[2172]: I0213 07:52:44.501757 2172 memory_manager.go:169] "Starting memorymanager" policy="None"
Feb 13 07:52:44.501792 kubelet[2172]: I0213 07:52:44.501768 2172 state_mem.go:35] "Initializing new in-memory state store"
Feb 13 07:52:44.504850 systemd[1]: Created slice kubepods.slice.
Feb 13 07:52:44.506882 systemd[1]: Created slice kubepods-burstable.slice.
Feb 13 07:52:44.508115 systemd[1]: Created slice kubepods-besteffort.slice.
Feb 13 07:52:44.530061 kubelet[2172]: I0213 07:52:44.530010 2172 manager.go:455] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Feb 13 07:52:44.529000 audit[2172]: AVC avc: denied { mac_admin } for pid=2172 comm="kubelet" capability=33 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:44.529000 audit: SELINUX_ERR op=setxattr invalid_context="system_u:object_r:container_file_t:s0"
Feb 13 07:52:44.529000 audit[2172]: SYSCALL arch=c000003e syscall=188 success=no exit=-22 a0=c0008741b0 a1=c0006ea498 a2=c000874180 a3=25 items=0 ppid=1 pid=2172 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kubelet" exe="/opt/bin/kubelet" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 07:52:44.529000 audit: PROCTITLE proctitle=2F6F70742F62696E2F6B7562656C6574002D2D626F6F7473747261702D6B756265636F6E6669673D2F6574632F6B756265726E657465732F626F6F7473747261702D6B7562656C65742E636F6E66002D2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F6B7562656C65742E636F6E66002D2D636F6E6669
Feb 13 07:52:44.530895 kubelet[2172]: I0213 07:52:44.530169 2172 server.go:88] "Unprivileged containerized plugins might not work. Could not set selinux context on socket dir" path="/var/lib/kubelet/device-plugins/" err="setxattr /var/lib/kubelet/device-plugins/: invalid argument"
Feb 13 07:52:44.530895 kubelet[2172]: I0213 07:52:44.530594 2172 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Feb 13 07:52:44.531385 kubelet[2172]: E0213 07:52:44.531335 2172 eviction_manager.go:262] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-3510.3.2-a-bf0bde3476\" not found"
Feb 13 07:52:44.591777 kubelet[2172]: I0213 07:52:44.591673 2172 kubelet_node_status.go:70] "Attempting to register node" node="ci-3510.3.2-a-bf0bde3476"
Feb 13 07:52:44.592431 kubelet[2172]: E0213 07:52:44.592356 2172 kubelet_node_status.go:92] "Unable to register node with API server" err="Post \"https://145.40.90.207:6443/api/v1/nodes\": dial tcp 145.40.90.207:6443: connect: connection refused" node="ci-3510.3.2-a-bf0bde3476"
Feb 13 07:52:44.596620 kubelet[2172]: I0213 07:52:44.596540 2172 topology_manager.go:212] "Topology Admit Handler"
Feb 13 07:52:44.599665 kubelet[2172]: I0213 07:52:44.599584 2172 topology_manager.go:212] "Topology Admit Handler"
Feb 13 07:52:44.603197 kubelet[2172]: I0213 07:52:44.603114 2172 topology_manager.go:212] "Topology Admit Handler"
Feb 13 07:52:44.614719 systemd[1]: Created slice kubepods-burstable-pod0a87ed2b1942aaccd48ed0dcff5d4971.slice.
Feb 13 07:52:44.630073 systemd[1]: Created slice kubepods-burstable-pod7e9cf9721535bab49bc5b91ca08afb11.slice.
Feb 13 07:52:44.647796 systemd[1]: Created slice kubepods-burstable-podb134222ffd42ce786b8efd601ab30089.slice.
Feb 13 07:52:44.689026 kubelet[2172]: E0213 07:52:44.688936 2172 controller.go:146] "Failed to ensure lease exists, will retry" err="Get \"https://145.40.90.207:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-3510.3.2-a-bf0bde3476?timeout=10s\": dial tcp 145.40.90.207:6443: connect: connection refused" interval="400ms"
Feb 13 07:52:44.789695 kubelet[2172]: I0213 07:52:44.789597 2172 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/0a87ed2b1942aaccd48ed0dcff5d4971-ca-certs\") pod \"kube-apiserver-ci-3510.3.2-a-bf0bde3476\" (UID: \"0a87ed2b1942aaccd48ed0dcff5d4971\") " pod="kube-system/kube-apiserver-ci-3510.3.2-a-bf0bde3476"
Feb 13 07:52:44.789695 kubelet[2172]: I0213 07:52:44.789705 2172 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/0a87ed2b1942aaccd48ed0dcff5d4971-k8s-certs\") pod \"kube-apiserver-ci-3510.3.2-a-bf0bde3476\" (UID: \"0a87ed2b1942aaccd48ed0dcff5d4971\") " pod="kube-system/kube-apiserver-ci-3510.3.2-a-bf0bde3476"
Feb 13 07:52:44.790079 kubelet[2172]: I0213 07:52:44.789771 2172 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/b134222ffd42ce786b8efd601ab30089-kubeconfig\") pod \"kube-scheduler-ci-3510.3.2-a-bf0bde3476\" (UID: \"b134222ffd42ce786b8efd601ab30089\") " pod="kube-system/kube-scheduler-ci-3510.3.2-a-bf0bde3476"
Feb 13 07:52:44.790079 kubelet[2172]: I0213 07:52:44.789927 2172 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/7e9cf9721535bab49bc5b91ca08afb11-kubeconfig\") pod \"kube-controller-manager-ci-3510.3.2-a-bf0bde3476\" (UID: \"7e9cf9721535bab49bc5b91ca08afb11\") " pod="kube-system/kube-controller-manager-ci-3510.3.2-a-bf0bde3476"
Feb 13 07:52:44.790079 kubelet[2172]: I0213 07:52:44.790030 2172 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/7e9cf9721535bab49bc5b91ca08afb11-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-3510.3.2-a-bf0bde3476\" (UID: \"7e9cf9721535bab49bc5b91ca08afb11\") " pod="kube-system/kube-controller-manager-ci-3510.3.2-a-bf0bde3476"
Feb 13 07:52:44.790356 kubelet[2172]: I0213 07:52:44.790091 2172 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/0a87ed2b1942aaccd48ed0dcff5d4971-usr-share-ca-certificates\") pod \"kube-apiserver-ci-3510.3.2-a-bf0bde3476\" (UID: \"0a87ed2b1942aaccd48ed0dcff5d4971\") " pod="kube-system/kube-apiserver-ci-3510.3.2-a-bf0bde3476"
Feb 13 07:52:44.790356 kubelet[2172]: I0213 07:52:44.790198 2172 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/7e9cf9721535bab49bc5b91ca08afb11-ca-certs\") pod \"kube-controller-manager-ci-3510.3.2-a-bf0bde3476\" (UID: \"7e9cf9721535bab49bc5b91ca08afb11\") " pod="kube-system/kube-controller-manager-ci-3510.3.2-a-bf0bde3476"
Feb 13 07:52:44.790356 kubelet[2172]: I0213 07:52:44.790341 2172 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/7e9cf9721535bab49bc5b91ca08afb11-flexvolume-dir\") pod \"kube-controller-manager-ci-3510.3.2-a-bf0bde3476\" (UID: \"7e9cf9721535bab49bc5b91ca08afb11\") " pod="kube-system/kube-controller-manager-ci-3510.3.2-a-bf0bde3476"
Feb 13 07:52:44.790656 kubelet[2172]: I0213 07:52:44.790451 2172 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/7e9cf9721535bab49bc5b91ca08afb11-k8s-certs\") pod \"kube-controller-manager-ci-3510.3.2-a-bf0bde3476\" (UID: \"7e9cf9721535bab49bc5b91ca08afb11\") " pod="kube-system/kube-controller-manager-ci-3510.3.2-a-bf0bde3476"
Feb 13 07:52:44.796278 kubelet[2172]: I0213 07:52:44.796206 2172 kubelet_node_status.go:70] "Attempting to register node" node="ci-3510.3.2-a-bf0bde3476"
Feb 13 07:52:44.796929 kubelet[2172]: E0213 07:52:44.796852 2172 kubelet_node_status.go:92] "Unable to register node with API server" err="Post \"https://145.40.90.207:6443/api/v1/nodes\": dial tcp 145.40.90.207:6443: connect: connection refused" node="ci-3510.3.2-a-bf0bde3476"
Feb 13 07:52:44.928402 env[1458]: time="2024-02-13T07:52:44.928255371Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-3510.3.2-a-bf0bde3476,Uid:0a87ed2b1942aaccd48ed0dcff5d4971,Namespace:kube-system,Attempt:0,}"
Feb 13 07:52:44.943231 env[1458]: time="2024-02-13T07:52:44.943124695Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-3510.3.2-a-bf0bde3476,Uid:7e9cf9721535bab49bc5b91ca08afb11,Namespace:kube-system,Attempt:0,}"
Feb 13 07:52:44.955875 env[1458]: time="2024-02-13T07:52:44.955807731Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-3510.3.2-a-bf0bde3476,Uid:b134222ffd42ce786b8efd601ab30089,Namespace:kube-system,Attempt:0,}"
Feb 13 07:52:45.090291 kubelet[2172]: E0213 07:52:45.090124 2172 controller.go:146] "Failed to ensure lease exists, will retry" err="Get \"https://145.40.90.207:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-3510.3.2-a-bf0bde3476?timeout=10s\": dial tcp 145.40.90.207:6443: connect: connection refused" interval="800ms"
Feb 13 07:52:45.201485 kubelet[2172]: I0213 07:52:45.201430 2172 kubelet_node_status.go:70] "Attempting to register node" node="ci-3510.3.2-a-bf0bde3476"
Feb 13 07:52:45.202146 kubelet[2172]: E0213 07:52:45.202112 2172 kubelet_node_status.go:92] "Unable to register node with API server" err="Post \"https://145.40.90.207:6443/api/v1/nodes\": dial tcp 145.40.90.207:6443: connect: connection refused" node="ci-3510.3.2-a-bf0bde3476"
Feb 13 07:52:45.453747 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2004106259.mount: Deactivated successfully.
Feb 13 07:52:45.455253 env[1458]: time="2024-02-13T07:52:45.455237079Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/pause:3.6,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 13 07:52:45.456146 env[1458]: time="2024-02-13T07:52:45.456116513Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/pause:3.6,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 13 07:52:45.456623 env[1458]: time="2024-02-13T07:52:45.456611832Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:6270bb605e12e581514ada5fd5b3216f727db55dc87d5889c790e4c760683fee,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 13 07:52:45.457964 env[1458]: time="2024-02-13T07:52:45.457953358Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/pause:3.6,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 13 07:52:45.458300 env[1458]: time="2024-02-13T07:52:45.458290040Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/pause:3.6,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 13 07:52:45.459108 env[1458]: time="2024-02-13T07:52:45.459096916Z" level=info msg="ImageUpdate event &ImageUpdate{Name:sha256:6270bb605e12e581514ada5fd5b3216f727db55dc87d5889c790e4c760683fee,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 13 07:52:45.459901 env[1458]: time="2024-02-13T07:52:45.459889943Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/pause@sha256:3d380ca8864549e74af4b29c10f9cb0956236dfb01c40ca076fb6c37253234db,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 13 07:52:45.460961 env[1458]: time="2024-02-13T07:52:45.460950578Z" level=info msg="ImageUpdate event &ImageUpdate{Name:sha256:6270bb605e12e581514ada5fd5b3216f727db55dc87d5889c790e4c760683fee,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 13 07:52:45.461849 env[1458]: time="2024-02-13T07:52:45.461820201Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/pause:3.6,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 13 07:52:45.463043 env[1458]: time="2024-02-13T07:52:45.463030658Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/pause:3.6,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 13 07:52:45.463444 env[1458]: time="2024-02-13T07:52:45.463399212Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/pause@sha256:3d380ca8864549e74af4b29c10f9cb0956236dfb01c40ca076fb6c37253234db,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 13 07:52:45.464193 env[1458]: time="2024-02-13T07:52:45.464154329Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/pause@sha256:3d380ca8864549e74af4b29c10f9cb0956236dfb01c40ca076fb6c37253234db,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 13 07:52:45.467106 env[1458]: time="2024-02-13T07:52:45.467035338Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Feb 13 07:52:45.467106 env[1458]: time="2024-02-13T07:52:45.467066195Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Feb 13 07:52:45.467106 env[1458]: time="2024-02-13T07:52:45.467076028Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Feb 13 07:52:45.467214 env[1458]: time="2024-02-13T07:52:45.467151740Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/86b39390004d20c42f3f347dea41bc1e427f25b0e3283b5fe6e289bc6f14ddca pid=2221 runtime=io.containerd.runc.v2
Feb 13 07:52:45.469743 env[1458]: time="2024-02-13T07:52:45.469705776Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Feb 13 07:52:45.469743 env[1458]: time="2024-02-13T07:52:45.469730001Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Feb 13 07:52:45.469743 env[1458]: time="2024-02-13T07:52:45.469741066Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Feb 13 07:52:45.469898 env[1458]: time="2024-02-13T07:52:45.469809272Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/3a52128fd831e5fbd9dbfc296e578d4005f39006c7fd0f20f56c0241c9e3a81b pid=2242 runtime=io.containerd.runc.v2
Feb 13 07:52:45.470398 env[1458]: time="2024-02-13T07:52:45.470373587Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Feb 13 07:52:45.470398 env[1458]: time="2024-02-13T07:52:45.470393361Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Feb 13 07:52:45.470440 env[1458]: time="2024-02-13T07:52:45.470400406Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Feb 13 07:52:45.470492 env[1458]: time="2024-02-13T07:52:45.470477820Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/d058bcf52053f976177164c3c384695e2bd8304ce9b5261435f16835493a1649 pid=2256 runtime=io.containerd.runc.v2
Feb 13 07:52:45.474701 systemd[1]: Started cri-containerd-86b39390004d20c42f3f347dea41bc1e427f25b0e3283b5fe6e289bc6f14ddca.scope.
Feb 13 07:52:45.478137 systemd[1]: Started cri-containerd-3a52128fd831e5fbd9dbfc296e578d4005f39006c7fd0f20f56c0241c9e3a81b.scope.
Feb 13 07:52:45.478961 systemd[1]: Started cri-containerd-d058bcf52053f976177164c3c384695e2bd8304ce9b5261435f16835493a1649.scope.
Feb 13 07:52:45.483000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.483000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.483000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.483000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.483000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.483000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.483000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.483000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.483000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.483000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.483000 audit: BPF prog-id=64 op=LOAD
Feb 13 07:52:45.483000 audit[2235]: AVC avc: denied { bpf } for pid=2235 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.483000 audit[2235]: SYSCALL arch=c000003e syscall=321 success=yes exit=0 a0=f a1=c000145c48 a2=10 a3=1c items=0 ppid=2221 pid=2235 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 07:52:45.483000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3836623339333930303034643230633432663366333437646561343162
Feb 13 07:52:45.483000 audit[2235]: AVC avc: denied { perfmon } for pid=2235 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.483000 audit[2235]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=0 a1=c0001456b0 a2=3c a3=c items=0 ppid=2221 pid=2235 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 07:52:45.483000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3836623339333930303034643230633432663366333437646561343162
Feb 13 07:52:45.483000 audit[2235]: AVC avc: denied { bpf } for pid=2235 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.483000 audit[2235]: AVC avc: denied { bpf } for pid=2235 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.483000 audit[2235]: AVC avc: denied { bpf } for pid=2235 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.483000 audit[2235]: AVC avc: denied { perfmon } for pid=2235 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.483000 audit[2235]: AVC avc: denied { perfmon } for pid=2235 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.483000 audit[2235]: AVC avc: denied { perfmon } for pid=2235 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.483000 audit[2235]: AVC avc: denied { perfmon } for pid=2235 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.483000 audit[2235]: AVC avc: denied { perfmon } for pid=2235 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.483000 audit[2235]: AVC avc: denied { bpf } for pid=2235 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.483000 audit[2235]: AVC avc: denied { bpf } for pid=2235 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.483000 audit: BPF prog-id=65 op=LOAD
Feb 13 07:52:45.483000 audit[2235]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=5 a1=c0001459d8 a2=78 a3=c000331d10 items=0 ppid=2221 pid=2235 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 07:52:45.483000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3836623339333930303034643230633432663366333437646561343162
Feb 13 07:52:45.484000 audit[2235]: AVC avc: denied { bpf } for pid=2235 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.484000 audit[2235]: AVC avc: denied { bpf } for pid=2235 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.484000 audit[2235]: AVC avc: denied { perfmon } for pid=2235 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.484000 audit[2235]: AVC avc: denied { perfmon } for pid=2235 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.484000 audit[2235]: AVC avc: denied { perfmon } for pid=2235 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.484000 audit[2235]: AVC avc: denied { perfmon } for pid=2235 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.484000 audit[2235]: AVC avc: denied { perfmon } for pid=2235 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.484000 audit[2235]: AVC avc: denied { bpf } for pid=2235 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.484000 audit[2235]: AVC avc: denied { bpf } for pid=2235 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.484000 audit: BPF prog-id=66 op=LOAD
Feb 13 07:52:45.484000 audit[2235]: SYSCALL arch=c000003e syscall=321 success=yes exit=18 a0=5 a1=c000145770 a2=78 a3=c000331d58 items=0 ppid=2221 pid=2235 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 07:52:45.484000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3836623339333930303034643230633432663366333437646561343162
Feb 13 07:52:45.484000 audit: BPF prog-id=66 op=UNLOAD
Feb 13 07:52:45.484000 audit: BPF prog-id=65 op=UNLOAD
Feb 13 07:52:45.484000 audit[2235]: AVC avc: denied { bpf } for pid=2235 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.484000 audit[2235]: AVC avc: denied { bpf } for pid=2235 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.484000 audit[2235]: AVC avc: denied { bpf } for pid=2235 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.484000 audit[2235]: AVC avc: denied { perfmon } for pid=2235 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.484000 audit[2235]: AVC avc: denied { perfmon } for pid=2235 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.484000 audit[2235]: AVC avc: denied { perfmon } for pid=2235 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.484000 audit[2235]: AVC avc: denied { perfmon } for pid=2235 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.484000 audit[2235]: AVC avc: denied { perfmon } for pid=2235 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.484000 audit[2235]: AVC avc: denied { bpf } for pid=2235 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.484000 audit[2235]: AVC avc: denied { bpf } for pid=2235 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.484000 audit: BPF prog-id=67 op=LOAD
Feb 13 07:52:45.484000 audit[2235]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=5 a1=c000145c30 a2=78 a3=c0003ae168 items=0 ppid=2221 pid=2235 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 07:52:45.484000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3836623339333930303034643230633432663366333437646561343162
Feb 13 07:52:45.484000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.484000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.484000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.484000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.484000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.484000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.484000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.484000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.484000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.484000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.484000 audit: BPF prog-id=68 op=LOAD
Feb 13 07:52:45.485000 audit[2266]: AVC avc: denied { bpf } for pid=2266 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.485000 audit[2266]: SYSCALL arch=c000003e syscall=321 success=yes exit=0 a0=f a1=c000195c48 a2=10 a3=1c items=0 ppid=2242 pid=2266 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 07:52:45.485000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3361353231323866643833316535666264396462666332393665353738
Feb 13 07:52:45.485000 audit[2266]: AVC avc: denied { perfmon } for pid=2266 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.485000 audit[2266]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=0 a1=c0001956b0 a2=3c a3=c items=0 ppid=2242 pid=2266 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 07:52:45.485000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3361353231323866643833316535666264396462666332393665353738
Feb 13 07:52:45.485000 audit[2266]: AVC avc: denied { bpf } for pid=2266 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.485000 audit[2266]: AVC avc: denied { bpf } for pid=2266 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.485000 audit[2266]: AVC avc: denied { bpf } for pid=2266 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.485000 audit[2266]: AVC avc: denied { perfmon } for pid=2266 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.485000 audit[2266]: AVC avc: denied { perfmon } for pid=2266 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.485000 audit[2266]: AVC avc: denied { perfmon } for pid=2266 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.485000 audit[2266]: AVC avc: denied { perfmon } for pid=2266 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.485000 audit[2266]: AVC avc: denied { perfmon } for pid=2266 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.485000 audit[2266]: AVC avc: denied { bpf } for pid=2266 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.485000 audit[2266]: AVC avc: denied { bpf } for pid=2266 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.485000 audit: BPF prog-id=69 op=LOAD
Feb 13 07:52:45.485000 audit[2266]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=5 a1=c0001959d8 a2=78 a3=c000233a50 items=0 ppid=2242 pid=2266 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 07:52:45.485000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3361353231323866643833316535666264396462666332393665353738
Feb 13 07:52:45.485000 audit[2266]: AVC avc: denied { bpf } for pid=2266 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.485000 audit[2266]: AVC avc: denied { bpf } for pid=2266 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.485000 audit[2266]: AVC avc: denied { perfmon } for pid=2266 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.485000 audit[2266]: AVC avc: denied { perfmon } for pid=2266 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.485000 audit[2266]: AVC avc: denied { perfmon } for pid=2266 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.485000 audit[2266]: AVC avc: denied { perfmon } for pid=2266 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.485000 audit[2266]: AVC avc: denied { perfmon } for pid=2266 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.485000 audit[2266]: AVC avc: denied { bpf } for pid=2266 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.485000 audit[2266]: AVC avc: denied { bpf } for pid=2266 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.485000 audit: BPF prog-id=70 op=LOAD
Feb 13 07:52:45.485000 audit[2266]: SYSCALL arch=c000003e syscall=321 success=yes exit=18 a0=5 a1=c000195770 a2=78 a3=c000233a98 items=0 ppid=2242 pid=2266 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 07:52:45.485000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3361353231323866643833316535666264396462666332393665353738
Feb 13 07:52:45.485000 audit: BPF prog-id=70 op=UNLOAD
Feb 13 07:52:45.485000 audit: BPF prog-id=69 op=UNLOAD
Feb 13 07:52:45.485000 audit[2266]: AVC avc: denied { bpf } for pid=2266 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.485000 audit[2266]: AVC avc: denied { bpf } for pid=2266 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.485000 audit[2266]: AVC avc: denied { bpf } for pid=2266 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.485000 audit[2266]: AVC avc: denied { perfmon } for pid=2266 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.485000 audit[2266]: AVC avc: denied { perfmon } for pid=2266 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.485000 audit[2266]: AVC avc: denied { perfmon } for pid=2266 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.485000 audit[2266]: AVC avc: denied { perfmon } for pid=2266 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.485000 audit[2266]: AVC avc: denied { perfmon } for pid=2266 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.485000 audit[2266]: AVC avc: denied { bpf } for pid=2266 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.485000 audit[2266]: AVC avc: denied { bpf } for pid=2266 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.485000 audit: BPF prog-id=71 op=LOAD
Feb 13 07:52:45.485000 audit[2266]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=5 a1=c000195c30 a2=78 a3=c000233ea8 items=0 ppid=2242 pid=2266 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 07:52:45.485000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3361353231323866643833316535666264396462666332393665353738
Feb 13 07:52:45.485000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.485000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.485000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.485000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.485000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.485000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.485000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.485000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.485000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.485000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.485000 audit: BPF prog-id=72 op=LOAD
Feb 13 07:52:45.485000 audit[2271]: AVC avc: denied { bpf } for pid=2271 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.485000 audit[2271]: SYSCALL arch=c000003e syscall=321 success=yes exit=0 a0=f a1=c0001bdc48 a2=10 a3=1c items=0 ppid=2256 pid=2271 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 07:52:45.485000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6430353862636635323035336639373631373731363463336333383436
Feb 13 07:52:45.485000 audit[2271]: AVC avc: denied { perfmon } for pid=2271 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:52:45.485000 audit[2271]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=0 a1=c0001bd6b0 a2=3c a3=c items=0 ppid=2256 pid=2271 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 07:52:45.485000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6430353862636635323035336639373631373731363463336333383436
Feb 13 07:52:45.485000 audit[2271]: AVC
avc: denied { bpf } for pid=2271 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.485000 audit[2271]: AVC avc: denied { bpf } for pid=2271 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.485000 audit[2271]: AVC avc: denied { bpf } for pid=2271 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.485000 audit[2271]: AVC avc: denied { perfmon } for pid=2271 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.485000 audit[2271]: AVC avc: denied { perfmon } for pid=2271 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.485000 audit[2271]: AVC avc: denied { perfmon } for pid=2271 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.485000 audit[2271]: AVC avc: denied { perfmon } for pid=2271 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.485000 audit[2271]: AVC avc: denied { perfmon } for pid=2271 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.485000 audit[2271]: AVC avc: denied { bpf } for pid=2271 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.485000 audit[2271]: AVC avc: denied { bpf } for pid=2271 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.485000 audit: BPF prog-id=73 op=LOAD Feb 13 07:52:45.485000 audit[2271]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=5 a1=c0001bd9d8 a2=78 a3=c000024da0 items=0 ppid=2256 pid=2271 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:52:45.485000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6430353862636635323035336639373631373731363463336333383436 Feb 13 07:52:45.485000 audit[2271]: AVC avc: denied { bpf } for pid=2271 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.485000 audit[2271]: AVC avc: denied { bpf } for pid=2271 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.485000 audit[2271]: AVC avc: denied { perfmon } for pid=2271 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.485000 audit[2271]: AVC avc: denied { perfmon } for pid=2271 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 
tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.485000 audit[2271]: AVC avc: denied { perfmon } for pid=2271 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.485000 audit[2271]: AVC avc: denied { perfmon } for pid=2271 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.485000 audit[2271]: AVC avc: denied { perfmon } for pid=2271 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.485000 audit[2271]: AVC avc: denied { bpf } for pid=2271 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.485000 audit[2271]: AVC avc: denied { bpf } for pid=2271 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.485000 audit: BPF prog-id=74 op=LOAD Feb 13 07:52:45.485000 audit[2271]: SYSCALL arch=c000003e syscall=321 success=yes exit=18 a0=5 a1=c0001bd770 a2=78 a3=c000024de8 items=0 ppid=2256 pid=2271 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:52:45.485000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6430353862636635323035336639373631373731363463336333383436 Feb 13 07:52:45.485000 audit: BPF prog-id=74 op=UNLOAD Feb 13 07:52:45.485000 audit: BPF prog-id=73 op=UNLOAD Feb 13 07:52:45.485000 audit[2271]: AVC avc: denied { bpf } for pid=2271 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.485000 audit[2271]: AVC avc: denied { bpf } for pid=2271 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.485000 audit[2271]: AVC avc: denied { bpf } for pid=2271 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.485000 audit[2271]: AVC avc: denied { perfmon } for pid=2271 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.485000 audit[2271]: AVC avc: denied { perfmon } for pid=2271 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.485000 audit[2271]: AVC avc: denied { perfmon } for pid=2271 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.485000 audit[2271]: AVC avc: denied { perfmon } for pid=2271 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.485000 audit[2271]: AVC avc: denied { perfmon } for pid=2271 comm="runc" capability=38 
scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.485000 audit[2271]: AVC avc: denied { bpf } for pid=2271 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.485000 audit[2271]: AVC avc: denied { bpf } for pid=2271 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.485000 audit: BPF prog-id=75 op=LOAD Feb 13 07:52:45.485000 audit[2271]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=5 a1=c0001bdc30 a2=78 a3=c0000251f8 items=0 ppid=2256 pid=2271 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:52:45.485000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6430353862636635323035336639373631373731363463336333383436 Feb 13 07:52:45.502408 env[1458]: time="2024-02-13T07:52:45.502375102Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-3510.3.2-a-bf0bde3476,Uid:b134222ffd42ce786b8efd601ab30089,Namespace:kube-system,Attempt:0,} returns sandbox id \"86b39390004d20c42f3f347dea41bc1e427f25b0e3283b5fe6e289bc6f14ddca\"" Feb 13 07:52:45.503253 env[1458]: time="2024-02-13T07:52:45.503236014Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-3510.3.2-a-bf0bde3476,Uid:7e9cf9721535bab49bc5b91ca08afb11,Namespace:kube-system,Attempt:0,} returns sandbox id \"3a52128fd831e5fbd9dbfc296e578d4005f39006c7fd0f20f56c0241c9e3a81b\"" Feb 13 07:52:45.504218 env[1458]: time="2024-02-13T07:52:45.504199278Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-3510.3.2-a-bf0bde3476,Uid:0a87ed2b1942aaccd48ed0dcff5d4971,Namespace:kube-system,Attempt:0,} returns sandbox id \"d058bcf52053f976177164c3c384695e2bd8304ce9b5261435f16835493a1649\"" Feb 13 07:52:45.504252 env[1458]: time="2024-02-13T07:52:45.504223898Z" level=info msg="CreateContainer within sandbox \"86b39390004d20c42f3f347dea41bc1e427f25b0e3283b5fe6e289bc6f14ddca\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Feb 13 07:52:45.504754 env[1458]: time="2024-02-13T07:52:45.504740681Z" level=info msg="CreateContainer within sandbox \"3a52128fd831e5fbd9dbfc296e578d4005f39006c7fd0f20f56c0241c9e3a81b\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Feb 13 07:52:45.505378 env[1458]: time="2024-02-13T07:52:45.505367087Z" level=info msg="CreateContainer within sandbox \"d058bcf52053f976177164c3c384695e2bd8304ce9b5261435f16835493a1649\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Feb 13 07:52:45.510620 env[1458]: time="2024-02-13T07:52:45.510603876Z" level=info msg="CreateContainer within sandbox \"86b39390004d20c42f3f347dea41bc1e427f25b0e3283b5fe6e289bc6f14ddca\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"fa08e72783d41a94406a2870f491cc8ae3ab152a4155f8522e6dc1a7faf9d9c2\"" Feb 13 07:52:45.510875 env[1458]: time="2024-02-13T07:52:45.510863699Z" level=info msg="StartContainer for \"fa08e72783d41a94406a2870f491cc8ae3ab152a4155f8522e6dc1a7faf9d9c2\"" Feb 13 07:52:45.512218 
env[1458]: time="2024-02-13T07:52:45.512202124Z" level=info msg="CreateContainer within sandbox \"3a52128fd831e5fbd9dbfc296e578d4005f39006c7fd0f20f56c0241c9e3a81b\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"bcc38db6f6f0037cec0b41089609879d8f21e009822a90178dc45046387d2fff\"" Feb 13 07:52:45.512346 env[1458]: time="2024-02-13T07:52:45.512327171Z" level=info msg="CreateContainer within sandbox \"d058bcf52053f976177164c3c384695e2bd8304ce9b5261435f16835493a1649\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"eef301d991610380a05e758c8cdea6f05a2d2a6f4636cb40c42b75f92b18b53b\"" Feb 13 07:52:45.512411 env[1458]: time="2024-02-13T07:52:45.512395273Z" level=info msg="StartContainer for \"bcc38db6f6f0037cec0b41089609879d8f21e009822a90178dc45046387d2fff\"" Feb 13 07:52:45.513517 env[1458]: time="2024-02-13T07:52:45.513503089Z" level=info msg="StartContainer for \"eef301d991610380a05e758c8cdea6f05a2d2a6f4636cb40c42b75f92b18b53b\"" Feb 13 07:52:45.520019 systemd[1]: Started cri-containerd-fa08e72783d41a94406a2870f491cc8ae3ab152a4155f8522e6dc1a7faf9d9c2.scope. Feb 13 07:52:45.522253 systemd[1]: Started cri-containerd-bcc38db6f6f0037cec0b41089609879d8f21e009822a90178dc45046387d2fff.scope. Feb 13 07:52:45.522974 systemd[1]: Started cri-containerd-eef301d991610380a05e758c8cdea6f05a2d2a6f4636cb40c42b75f92b18b53b.scope. Feb 13 07:52:45.529000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.529000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.529000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.529000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.529000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.529000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.529000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.529000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.529000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.529000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.529000 audit: BPF prog-id=76 op=LOAD Feb 13 07:52:45.529000 
audit[2347]: AVC avc: denied { bpf } for pid=2347 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.529000 audit[2347]: SYSCALL arch=c000003e syscall=321 success=yes exit=0 a0=f a1=c000197c48 a2=10 a3=1c items=0 ppid=2221 pid=2347 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:52:45.529000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6661303865373237383364343161393434303661323837306634393163 Feb 13 07:52:45.529000 audit[2347]: AVC avc: denied { perfmon } for pid=2347 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.529000 audit[2347]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=0 a1=c0001976b0 a2=3c a3=8 items=0 ppid=2221 pid=2347 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:52:45.529000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6661303865373237383364343161393434303661323837306634393163 Feb 13 07:52:45.529000 audit[2347]: AVC avc: denied { bpf } for pid=2347 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.529000 audit[2347]: AVC avc: denied { bpf } for pid=2347 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.529000 audit[2347]: AVC avc: denied { bpf } for pid=2347 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.529000 audit[2347]: AVC avc: denied { perfmon } for pid=2347 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.529000 audit[2347]: AVC avc: denied { perfmon } for pid=2347 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.529000 audit[2347]: AVC avc: denied { perfmon } for pid=2347 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.529000 audit[2347]: AVC avc: denied { perfmon } for pid=2347 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.529000 audit[2347]: AVC avc: denied { perfmon } for pid=2347 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.529000 audit[2347]: AVC avc: denied { bpf } for pid=2347 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 
tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.529000 audit[2347]: AVC avc: denied { bpf } for pid=2347 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.529000 audit: BPF prog-id=77 op=LOAD Feb 13 07:52:45.529000 audit[2347]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=5 a1=c0001979d8 a2=78 a3=c00021f280 items=0 ppid=2221 pid=2347 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:52:45.529000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6661303865373237383364343161393434303661323837306634393163 Feb 13 07:52:45.529000 audit[2347]: AVC avc: denied { bpf } for pid=2347 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.529000 audit[2347]: AVC avc: denied { bpf } for pid=2347 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.529000 audit[2347]: AVC avc: denied { perfmon } for pid=2347 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.529000 audit[2347]: AVC avc: denied { perfmon } for pid=2347 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.529000 audit[2347]: AVC avc: denied { perfmon } for pid=2347 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.529000 audit[2347]: AVC avc: denied { perfmon } for pid=2347 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.529000 audit[2347]: AVC avc: denied { perfmon } for pid=2347 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.529000 audit[2347]: AVC avc: denied { bpf } for pid=2347 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.529000 audit[2347]: AVC avc: denied { bpf } for pid=2347 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.529000 audit: BPF prog-id=78 op=LOAD Feb 13 07:52:45.529000 audit[2347]: SYSCALL arch=c000003e syscall=321 success=yes exit=18 a0=5 a1=c000197770 a2=78 a3=c00021f2c8 items=0 ppid=2221 pid=2347 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:52:45.529000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6661303865373237383364343161393434303661323837306634393163 Feb 13 07:52:45.529000 audit: BPF prog-id=78 op=UNLOAD Feb 13 07:52:45.529000 audit: BPF prog-id=77 op=UNLOAD Feb 13 07:52:45.529000 audit[2347]: AVC avc: denied { bpf } for pid=2347 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.529000 audit[2347]: AVC avc: denied { bpf } for pid=2347 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.529000 audit[2347]: AVC avc: denied { bpf } for pid=2347 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.529000 audit[2347]: AVC avc: denied { perfmon } for pid=2347 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.529000 audit[2347]: AVC avc: denied { perfmon } for pid=2347 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.529000 audit[2347]: AVC avc: denied { perfmon } for pid=2347 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.529000 audit[2347]: AVC avc: denied { perfmon } for pid=2347 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.529000 audit[2347]: AVC avc: denied { perfmon } for pid=2347 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.529000 audit[2347]: AVC avc: denied { bpf } for pid=2347 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.529000 audit[2347]: AVC avc: denied { bpf } for pid=2347 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.529000 audit: BPF prog-id=79 op=LOAD Feb 13 07:52:45.529000 audit[2347]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=5 a1=c000197c30 a2=78 a3=c00021f6d8 items=0 ppid=2221 pid=2347 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:52:45.529000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6661303865373237383364343161393434303661323837306634393163 Feb 13 07:52:45.530000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.530000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 
tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.530000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.530000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.530000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.530000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.530000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.530000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.530000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.530000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.530000 audit: BPF prog-id=80 op=LOAD Feb 13 07:52:45.530000 audit[2363]: AVC avc: denied { bpf } for pid=2363 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.530000 audit[2363]: SYSCALL arch=c000003e syscall=321 success=yes exit=0 a0=f a1=c000197c48 a2=10 a3=1c items=0 ppid=2256 pid=2363 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:52:45.530000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6565663330316439393136313033383061303565373538633863646561 Feb 13 07:52:45.530000 audit[2363]: AVC avc: denied { perfmon } for pid=2363 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.530000 audit[2363]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=0 a1=c0001976b0 a2=3c a3=8 items=0 ppid=2256 pid=2363 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:52:45.530000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6565663330316439393136313033383061303565373538633863646561 Feb 13 07:52:45.530000 audit[2363]: AVC 
avc: denied { bpf } for pid=2363 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.530000 audit[2363]: AVC avc: denied { bpf } for pid=2363 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.530000 audit[2363]: AVC avc: denied { bpf } for pid=2363 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.530000 audit[2363]: AVC avc: denied { perfmon } for pid=2363 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.530000 audit[2363]: AVC avc: denied { perfmon } for pid=2363 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.530000 audit[2363]: AVC avc: denied { perfmon } for pid=2363 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.530000 audit[2363]: AVC avc: denied { perfmon } for pid=2363 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.530000 audit[2363]: AVC avc: denied { perfmon } for pid=2363 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.530000 audit[2363]: AVC avc: denied { bpf } for pid=2363 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.530000 audit[2363]: AVC avc: denied { bpf } for pid=2363 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.530000 audit: BPF prog-id=81 op=LOAD Feb 13 07:52:45.530000 audit[2363]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=5 a1=c0001979d8 a2=78 a3=c000261d10 items=0 ppid=2256 pid=2363 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:52:45.530000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6565663330316439393136313033383061303565373538633863646561 Feb 13 07:52:45.530000 audit[2363]: AVC avc: denied { bpf } for pid=2363 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.530000 audit[2363]: AVC avc: denied { bpf } for pid=2363 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.530000 audit[2363]: AVC avc: denied { perfmon } for pid=2363 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.530000 audit[2363]: AVC avc: denied { perfmon } for pid=2363 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 
tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.530000 audit[2363]: AVC avc: denied { perfmon } for pid=2363 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.530000 audit[2363]: AVC avc: denied { perfmon } for pid=2363 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.530000 audit[2363]: AVC avc: denied { perfmon } for pid=2363 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.530000 audit[2363]: AVC avc: denied { bpf } for pid=2363 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.530000 audit[2363]: AVC avc: denied { bpf } for pid=2363 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.530000 audit: BPF prog-id=82 op=LOAD Feb 13 07:52:45.530000 audit[2363]: SYSCALL arch=c000003e syscall=321 success=yes exit=18 a0=5 a1=c000197770 a2=78 a3=c000261d58 items=0 ppid=2256 pid=2363 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:52:45.530000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6565663330316439393136313033383061303565373538633863646561 Feb 13 07:52:45.530000 audit: BPF prog-id=82 op=UNLOAD Feb 13 07:52:45.530000 audit: BPF prog-id=81 op=UNLOAD Feb 13 07:52:45.530000 audit[2363]: AVC avc: denied { bpf } for pid=2363 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.530000 audit[2363]: AVC avc: denied { bpf } for pid=2363 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.530000 audit[2363]: AVC avc: denied { bpf } for pid=2363 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.530000 audit[2363]: AVC avc: denied { perfmon } for pid=2363 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.530000 audit[2363]: AVC avc: denied { perfmon } for pid=2363 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.530000 audit[2363]: AVC avc: denied { perfmon } for pid=2363 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.530000 audit[2363]: AVC avc: denied { perfmon } for pid=2363 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.530000 audit[2363]: AVC avc: denied { perfmon } for pid=2363 comm="runc" capability=38 
scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.530000 audit[2363]: AVC avc: denied { bpf } for pid=2363 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.530000 audit[2363]: AVC avc: denied { bpf } for pid=2363 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.530000 audit: BPF prog-id=83 op=LOAD Feb 13 07:52:45.530000 audit[2363]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=5 a1=c000197c30 a2=78 a3=c0003e6168 items=0 ppid=2256 pid=2363 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:52:45.530000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6565663330316439393136313033383061303565373538633863646561 Feb 13 07:52:45.531000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.531000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.531000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.531000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.531000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.531000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.531000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.531000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.531000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.531000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.531000 audit: BPF prog-id=84 op=LOAD Feb 13 07:52:45.531000 audit[2359]: AVC avc: denied { bpf } for pid=2359 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 
tclass=capability2 permissive=0 Feb 13 07:52:45.531000 audit[2359]: SYSCALL arch=c000003e syscall=321 success=yes exit=0 a0=f a1=c0001bdc48 a2=10 a3=1c items=0 ppid=2242 pid=2359 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:52:45.531000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6263633338646236663666303033376365633062343130383936303938 Feb 13 07:52:45.531000 audit[2359]: AVC avc: denied { perfmon } for pid=2359 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.531000 audit[2359]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=0 a1=c0001bd6b0 a2=3c a3=8 items=0 ppid=2242 pid=2359 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:52:45.531000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6263633338646236663666303033376365633062343130383936303938 Feb 13 07:52:45.531000 audit[2359]: AVC avc: denied { bpf } for pid=2359 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.531000 audit[2359]: AVC avc: denied { bpf } for pid=2359 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.531000 audit[2359]: AVC avc: denied { bpf } for pid=2359 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.531000 audit[2359]: AVC avc: denied { perfmon } for pid=2359 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.531000 audit[2359]: AVC avc: denied { perfmon } for pid=2359 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.531000 audit[2359]: AVC avc: denied { perfmon } for pid=2359 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.531000 audit[2359]: AVC avc: denied { perfmon } for pid=2359 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.531000 audit[2359]: AVC avc: denied { perfmon } for pid=2359 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.531000 audit[2359]: AVC avc: denied { bpf } for pid=2359 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.531000 audit[2359]: AVC avc: denied { bpf } for pid=2359 comm="runc" capability=39 
scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.531000 audit: BPF prog-id=85 op=LOAD Feb 13 07:52:45.531000 audit[2359]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=5 a1=c0001bd9d8 a2=78 a3=c000251c60 items=0 ppid=2242 pid=2359 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:52:45.531000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6263633338646236663666303033376365633062343130383936303938 Feb 13 07:52:45.531000 audit[2359]: AVC avc: denied { bpf } for pid=2359 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.531000 audit[2359]: AVC avc: denied { bpf } for pid=2359 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.531000 audit[2359]: AVC avc: denied { perfmon } for pid=2359 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.531000 audit[2359]: AVC avc: denied { perfmon } for pid=2359 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.531000 audit[2359]: AVC avc: denied { perfmon } for pid=2359 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.531000 audit[2359]: AVC avc: denied { perfmon } for pid=2359 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.531000 audit[2359]: AVC avc: denied { perfmon } for pid=2359 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.531000 audit[2359]: AVC avc: denied { bpf } for pid=2359 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.531000 audit[2359]: AVC avc: denied { bpf } for pid=2359 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.531000 audit: BPF prog-id=86 op=LOAD Feb 13 07:52:45.531000 audit[2359]: SYSCALL arch=c000003e syscall=321 success=yes exit=18 a0=5 a1=c0001bd770 a2=78 a3=c000251ca8 items=0 ppid=2242 pid=2359 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:52:45.531000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6263633338646236663666303033376365633062343130383936303938 Feb 13 07:52:45.531000 audit: BPF prog-id=86 op=UNLOAD Feb 13 07:52:45.531000 audit: BPF prog-id=85 op=UNLOAD Feb 13 
07:52:45.531000 audit[2359]: AVC avc: denied { bpf } for pid=2359 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.531000 audit[2359]: AVC avc: denied { bpf } for pid=2359 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.531000 audit[2359]: AVC avc: denied { bpf } for pid=2359 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.531000 audit[2359]: AVC avc: denied { perfmon } for pid=2359 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.531000 audit[2359]: AVC avc: denied { perfmon } for pid=2359 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.531000 audit[2359]: AVC avc: denied { perfmon } for pid=2359 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.531000 audit[2359]: AVC avc: denied { perfmon } for pid=2359 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.531000 audit[2359]: AVC avc: denied { perfmon } for pid=2359 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.531000 audit[2359]: AVC avc: denied { bpf } for pid=2359 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.531000 audit[2359]: AVC avc: denied { bpf } for pid=2359 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:45.531000 audit: BPF prog-id=87 op=LOAD Feb 13 07:52:45.531000 audit[2359]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=5 a1=c0001bdc30 a2=78 a3=c0003a00b8 items=0 ppid=2242 pid=2359 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:52:45.531000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6263633338646236663666303033376365633062343130383936303938 Feb 13 07:52:45.548670 env[1458]: time="2024-02-13T07:52:45.548612973Z" level=info msg="StartContainer for \"fa08e72783d41a94406a2870f491cc8ae3ab152a4155f8522e6dc1a7faf9d9c2\" returns successfully" Feb 13 07:52:45.548858 env[1458]: time="2024-02-13T07:52:45.548816311Z" level=info msg="StartContainer for \"eef301d991610380a05e758c8cdea6f05a2d2a6f4636cb40c42b75f92b18b53b\" returns successfully" Feb 13 07:52:45.549391 env[1458]: time="2024-02-13T07:52:45.549379348Z" level=info msg="StartContainer for \"bcc38db6f6f0037cec0b41089609879d8f21e009822a90178dc45046387d2fff\" returns successfully" Feb 13 07:52:46.003800 kubelet[2172]: I0213 07:52:46.003763 2172 kubelet_node_status.go:70] "Attempting to register node" 
node="ci-3510.3.2-a-bf0bde3476" Feb 13 07:52:46.105000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:52:46.133229 kernel: kauditd_printk_skb: 559 callbacks suppressed Feb 13 07:52:46.133337 kernel: audit: type=1400 audit(1707810766.105:683): avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:52:46.105000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=7 a1=c00048b590 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 07:52:46.343152 kernel: audit: type=1300 audit(1707810766.105:683): arch=c000003e syscall=254 success=no exit=-13 a0=7 a1=c00048b590 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 07:52:46.343193 kernel: audit: type=1327 audit(1707810766.105:683): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 07:52:46.105000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 07:52:46.435824 kernel: audit: type=1400 audit(1707810766.105:684): avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:52:46.105000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:52:46.525292 kernel: audit: type=1300 audit(1707810766.105:684): arch=c000003e syscall=254 success=no exit=-13 a0=9 a1=c00015eb60 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 07:52:46.105000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=9 a1=c00015eb60 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 07:52:46.644882 kernel: audit: type=1327 
audit(1707810766.105:684): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 07:52:46.105000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 07:52:46.737291 kernel: audit: type=1400 audit(1707810766.178:685): avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:52:46.178000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:52:46.826453 kernel: audit: type=1400 audit(1707810766.178:686): avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:52:46.178000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:52:46.917594 kernel: audit: type=1300 audit(1707810766.178:685): arch=c000003e syscall=254 success=no exit=-13 a0=3f a1=c00477e020 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 07:52:46.178000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=3f a1=c00477e020 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 07:52:46.927944 kubelet[2172]: E0213 07:52:46.927925 2172 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-3510.3.2-a-bf0bde3476\" not found" node="ci-3510.3.2-a-bf0bde3476" Feb 13 07:52:46.178000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=3e a1=c0049a28a0 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 07:52:47.024863 kubelet[2172]: I0213 07:52:47.024841 2172 kubelet_node_status.go:73] "Successfully registered node" node="ci-3510.3.2-a-bf0bde3476" Feb 13 07:52:47.115983 kernel: audit: type=1300 audit(1707810766.178:686): arch=c000003e syscall=254 success=no exit=-13 a0=3e a1=c0049a28a0 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 
tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 07:52:46.178000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 07:52:46.178000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 07:52:46.179000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/apiserver.crt" dev="sdb9" ino=524817 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:52:46.179000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=3e a1=c004aeed50 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 07:52:46.179000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 07:52:46.919000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-client.crt" dev="sdb9" ino=524823 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:52:46.919000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=4a a1=c003bfad50 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 07:52:46.919000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 07:52:46.919000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:52:46.919000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=73 a1=c004188f00 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 07:52:46.919000 audit: PROCTITLE 
proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 07:52:46.919000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:52:46.919000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=7a a1=c003bfb1a0 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 07:52:46.919000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 07:52:47.488081 kubelet[2172]: I0213 07:52:47.488001 2172 apiserver.go:52] "Watching apiserver" Feb 13 07:52:47.514656 kubelet[2172]: W0213 07:52:47.514556 2172 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Feb 13 07:52:47.588433 kubelet[2172]: I0213 07:52:47.588382 2172 desired_state_of_world_populator.go:153] "Finished populating initial desired state of world" Feb 13 07:52:47.608417 kubelet[2172]: I0213 07:52:47.608361 2172 reconciler.go:41] "Reconciler: start to sync state" Feb 13 07:52:49.039000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/opt/libexec/kubernetes/kubelet-plugins/volume/exec" dev="sdb9" ino=524848 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:usr_t:s0 tclass=dir permissive=0 Feb 13 07:52:49.039000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=8 a1=c000a5cd80 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 07:52:49.039000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 07:52:49.399112 systemd[1]: Reloading. Feb 13 07:52:49.457310 /usr/lib/systemd/system-generators/torcx-generator[2511]: time="2024-02-13T07:52:49Z" level=debug msg="common configuration parsed" base_dir=/var/lib/torcx/ conf_dir=/etc/torcx/ run_dir=/run/torcx/ store_paths="[/usr/share/torcx/store /usr/share/oem/torcx/store/3510.3.2 /usr/share/oem/torcx/store /var/lib/torcx/store/3510.3.2 /var/lib/torcx/store]" Feb 13 07:52:49.457336 /usr/lib/systemd/system-generators/torcx-generator[2511]: time="2024-02-13T07:52:49Z" level=info msg="torcx already run" Feb 13 07:52:49.533709 systemd[1]: /usr/lib/systemd/system/locksmithd.service:8: Unit uses CPUShares=; please use CPUWeight= instead. Support for CPUShares= will be removed soon. 
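Aside: the AVC records earlier in this log are the kernel refusing inotify watches that kube-apiserver and kube-controller-manager try to place on their certificate files (syscall=254 is inotify_add_watch on x86_64, and exit=-13 is -EACCES), and each PROCTITLE field carries the offending command line hex-encoded with NUL-separated arguments. A minimal decoding sketch (plain Python, not part of the log; the kernel truncates long command lines, so the final argument in these records stays cut off):

    def decode_proctitle(hexblob: str) -> str:
        # audit hex-encodes the command line; argv elements are NUL-separated
        raw = bytes.fromhex(hexblob)
        return " ".join(p.decode("utf-8", "replace") for p in raw.split(b"\x00"))

    # First two arguments of the kube-controller-manager record above:
    print(decode_proctitle(
        "6B7562652D636F6E74726F6C6C65722D6D616E61676572"
        "002D2D616C6C6F636174652D6E6F64652D63696472733D74727565"))
    # -> kube-controller-manager --allocate-node-cidrs=true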
Feb 13 07:52:49.533719 systemd[1]: /usr/lib/systemd/system/locksmithd.service:9: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 13 07:52:49.546736 systemd[1]: /run/systemd/system/docker.socket:8: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Feb 13 07:52:49.600000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.600000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.600000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.600000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.600000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.600000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.600000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.600000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.600000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.601000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.601000 audit: BPF prog-id=88 op=LOAD Feb 13 07:52:49.601000 audit: BPF prog-id=80 op=UNLOAD Feb 13 07:52:49.601000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.601000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.601000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.601000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.601000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 
scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.601000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.601000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.601000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.601000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.601000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.601000 audit: BPF prog-id=89 op=LOAD Feb 13 07:52:49.601000 audit: BPF prog-id=68 op=UNLOAD Feb 13 07:52:49.601000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.601000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.601000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.601000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.601000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.601000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.601000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.601000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.601000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.601000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.601000 audit: BPF prog-id=90 op=LOAD Feb 13 07:52:49.601000 audit: BPF prog-id=49 op=UNLOAD Feb 13 07:52:49.601000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 
scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.601000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.601000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.601000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.601000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.601000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.601000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.601000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.601000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.601000 audit: BPF prog-id=91 op=LOAD Feb 13 07:52:49.601000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.601000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.601000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.601000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.601000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.601000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.601000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.601000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.602000 audit[1]: AVC avc: denied { bpf 
} for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.602000 audit: BPF prog-id=92 op=LOAD Feb 13 07:52:49.602000 audit: BPF prog-id=50 op=UNLOAD Feb 13 07:52:49.602000 audit: BPF prog-id=51 op=UNLOAD Feb 13 07:52:49.602000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.602000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.602000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.602000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.602000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.602000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.602000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.602000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.602000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.602000 audit: BPF prog-id=93 op=LOAD Feb 13 07:52:49.602000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.602000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.602000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.602000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.602000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.602000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.602000 audit[1]: AVC avc: denied { perfmon } 
for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.602000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.602000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.602000 audit: BPF prog-id=94 op=LOAD Feb 13 07:52:49.602000 audit: BPF prog-id=52 op=UNLOAD Feb 13 07:52:49.602000 audit: BPF prog-id=53 op=UNLOAD Feb 13 07:52:49.602000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.602000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.602000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.602000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.602000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.602000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.602000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.602000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.602000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.602000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.602000 audit: BPF prog-id=95 op=LOAD Feb 13 07:52:49.602000 audit: BPF prog-id=64 op=UNLOAD Feb 13 07:52:49.602000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.603000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.603000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.603000 
audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.603000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.603000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.603000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.603000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.603000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.603000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.603000 audit: BPF prog-id=96 op=LOAD Feb 13 07:52:49.603000 audit: BPF prog-id=54 op=UNLOAD Feb 13 07:52:49.603000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.603000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.603000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.603000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.603000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.603000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.603000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.603000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.603000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.603000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 
scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.603000 audit: BPF prog-id=97 op=LOAD Feb 13 07:52:49.603000 audit: BPF prog-id=84 op=UNLOAD Feb 13 07:52:49.603000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.603000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.603000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.603000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.603000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.603000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.603000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.603000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.603000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.603000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.603000 audit: BPF prog-id=98 op=LOAD Feb 13 07:52:49.603000 audit: BPF prog-id=76 op=UNLOAD Feb 13 07:52:49.604000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.604000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.604000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.604000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.604000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.604000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 
scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.604000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.604000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.604000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.604000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.604000 audit: BPF prog-id=99 op=LOAD Feb 13 07:52:49.604000 audit: BPF prog-id=72 op=UNLOAD Feb 13 07:52:49.604000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.604000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.604000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.604000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.604000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.604000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.604000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.604000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.604000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.604000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.604000 audit: BPF prog-id=100 op=LOAD Feb 13 07:52:49.604000 audit: BPF prog-id=55 op=UNLOAD Feb 13 07:52:49.604000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.604000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 
scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.604000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.604000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.604000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.604000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.604000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.604000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.604000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.605000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.605000 audit: BPF prog-id=101 op=LOAD Feb 13 07:52:49.605000 audit: BPF prog-id=56 op=UNLOAD Feb 13 07:52:49.605000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.605000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.605000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.605000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.605000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.605000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.605000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.605000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 
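Aside: the long run of bpf/perfmon capability denials and paired "BPF prog-id=NN op=LOAD / op=UNLOAD" records that begins at "systemd[1]: Reloading." is consistent with systemd re-attaching its per-unit cgroup BPF programs during the daemon reload, each fresh prog-id loaded as the superseded one is unloaded. A quick sanity check over the journal text (a hedged sketch, assuming the log is piped in on stdin):

    import re
    import sys

    def tally_bpf_ops(text: str) -> dict:
        # Count the "audit: BPF prog-id=NN op=LOAD|UNLOAD" records
        counts = {"LOAD": 0, "UNLOAD": 0}
        for op in re.findall(r"BPF prog-id=\d+ op=(LOAD|UNLOAD)", text):
            counts[op] += 1
        return counts

    print(tally_bpf_ops(sys.stdin.read()))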
Feb 13 07:52:49.605000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.605000 audit: BPF prog-id=102 op=LOAD Feb 13 07:52:49.605000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.605000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.605000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.605000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.605000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.605000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.605000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.605000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.605000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.605000 audit: BPF prog-id=103 op=LOAD Feb 13 07:52:49.605000 audit: BPF prog-id=57 op=UNLOAD Feb 13 07:52:49.605000 audit: BPF prog-id=58 op=UNLOAD Feb 13 07:52:49.605000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.605000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.605000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.605000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.605000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.605000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 
07:52:49.605000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.605000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.605000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.606000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.606000 audit: BPF prog-id=104 op=LOAD Feb 13 07:52:49.606000 audit: BPF prog-id=59 op=UNLOAD Feb 13 07:52:49.606000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.606000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.606000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.606000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.606000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.606000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.606000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.606000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.606000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.606000 audit: BPF prog-id=105 op=LOAD Feb 13 07:52:49.606000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.606000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.606000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.606000 audit[1]: AVC avc: denied { perfmon } for pid=1 
comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.606000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.606000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.606000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.606000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.606000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.606000 audit: BPF prog-id=106 op=LOAD Feb 13 07:52:49.606000 audit: BPF prog-id=60 op=UNLOAD Feb 13 07:52:49.606000 audit: BPF prog-id=61 op=UNLOAD Feb 13 07:52:49.606000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.606000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.606000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.606000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.606000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.606000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.606000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.606000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.606000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.606000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.606000 audit: BPF prog-id=107 op=LOAD Feb 13 07:52:49.606000 audit: BPF prog-id=62 op=UNLOAD Feb 13 
07:52:49.608000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.608000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.608000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.608000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.608000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.608000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.608000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.608000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.608000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.608000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.608000 audit: BPF prog-id=108 op=LOAD Feb 13 07:52:49.608000 audit: BPF prog-id=63 op=UNLOAD Feb 13 07:52:49.615974 systemd[1]: Stopping kubelet.service... Feb 13 07:52:49.640199 systemd[1]: kubelet.service: Deactivated successfully. Feb 13 07:52:49.640554 systemd[1]: Stopped kubelet.service. Feb 13 07:52:49.639000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:52:49.643934 systemd[1]: Started kubelet.service. Feb 13 07:52:49.643000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:52:49.670455 kubelet[2569]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 13 07:52:49.670455 kubelet[2569]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. 
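Aside: from here on the restarted kubelet (pid 2569) logs in klog format: a severity letter (I/W/E/F), MMDD date, wall-clock time, pid, source file and line, then the message. A minimal parsing sketch for lines shaped like the kubelet records above (illustrative only):

    import re

    KLOG = re.compile(
        r'(?P<sev>[IWEF])(?P<mmdd>\d{4}) (?P<time>[\d:.]+)\s+'
        r'(?P<pid>\d+) (?P<src>[\w.]+:\d+)\] (?P<msg>.*)')

    m = KLOG.search(
        'I0213 07:52:49.673009 2569 server.go:415] '
        '"Kubelet version" kubeletVersion="v1.27.2"')
    print(m.group("sev"), m.group("src"), m.group("msg"))
    # -> I server.go:415 "Kubelet version" kubeletVersion="v1.27.2"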
Feb 13 07:52:49.670455 kubelet[2569]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 13 07:52:49.670455 kubelet[2569]: I0213 07:52:49.670440 2569 server.go:199] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Feb 13 07:52:49.673027 kubelet[2569]: I0213 07:52:49.673009 2569 server.go:415] "Kubelet version" kubeletVersion="v1.27.2" Feb 13 07:52:49.673027 kubelet[2569]: I0213 07:52:49.673027 2569 server.go:417] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Feb 13 07:52:49.673223 kubelet[2569]: I0213 07:52:49.673215 2569 server.go:837] "Client rotation is on, will bootstrap in background" Feb 13 07:52:49.674928 kubelet[2569]: I0213 07:52:49.674913 2569 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Feb 13 07:52:49.676658 kubelet[2569]: I0213 07:52:49.676639 2569 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Feb 13 07:52:49.693110 kubelet[2569]: I0213 07:52:49.693068 2569 server.go:662] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Feb 13 07:52:49.693189 kubelet[2569]: I0213 07:52:49.693169 2569 container_manager_linux.go:266] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Feb 13 07:52:49.693216 kubelet[2569]: I0213 07:52:49.693211 2569 container_manager_linux.go:271] "Creating Container Manager object based on Node Config" nodeConfig={RuntimeCgroupsName: SystemCgroupsName: KubeletCgroupsName: KubeletOOMScoreAdj:-999 ContainerRuntime: CgroupsPerQOS:true CgroupRoot:/ CgroupDriver:systemd KubeletRootDir:/var/lib/kubelet ProtectKernelDefaults:false NodeAllocatableConfig:{KubeReservedCgroupName: SystemReservedCgroupName: ReservedSystemCPUs: EnforceNodeAllocatable:map[pods:{}] KubeReserved:map[] SystemReserved:map[] HardEvictionThresholds:[{Signal:memory.available Operator:LessThan Value:{Quantity:100Mi Percentage:0} GracePeriod:0s MinReclaim:} {Signal:nodefs.available Operator:LessThan Value:{Quantity: Percentage:0.1} GracePeriod:0s MinReclaim:} {Signal:nodefs.inodesFree Operator:LessThan Value:{Quantity: Percentage:0.05} GracePeriod:0s MinReclaim:} {Signal:imagefs.available Operator:LessThan Value:{Quantity: Percentage:0.15} GracePeriod:0s MinReclaim:}]} QOSReserved:map[] CPUManagerPolicy:none CPUManagerPolicyOptions:map[] TopologyManagerScope:container CPUManagerReconcilePeriod:10s ExperimentalMemoryManagerPolicy:None ExperimentalMemoryManagerReservedMemory:[] PodPidsLimit:-1 EnforceCPULimits:true CPUCFSQuotaPeriod:100ms TopologyManagerPolicy:none ExperimentalTopologyManagerPolicyOptions:map[]} Feb 13 07:52:49.693271 kubelet[2569]: I0213 07:52:49.693221 2569 topology_manager.go:136] "Creating topology manager with policy per scope" topologyPolicyName="none" topologyScopeName="container" Feb 13 07:52:49.693271 kubelet[2569]: I0213 07:52:49.693227 2569 container_manager_linux.go:302] "Creating device plugin manager" Feb 13 07:52:49.693271 kubelet[2569]: I0213 07:52:49.693243 2569 state_mem.go:36] "Initialized new in-memory state store" Feb 13 07:52:49.694606 kubelet[2569]: I0213 07:52:49.694598 2569 kubelet.go:405] "Attempting to sync node with API server" Feb 13 07:52:49.694606 kubelet[2569]: I0213 07:52:49.694608 2569 kubelet.go:298] 
"Adding static pod path" path="/etc/kubernetes/manifests" Feb 13 07:52:49.694664 kubelet[2569]: I0213 07:52:49.694618 2569 kubelet.go:309] "Adding apiserver pod source" Feb 13 07:52:49.694664 kubelet[2569]: I0213 07:52:49.694626 2569 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Feb 13 07:52:49.695581 kubelet[2569]: I0213 07:52:49.695553 2569 kuberuntime_manager.go:257] "Container runtime initialized" containerRuntime="containerd" version="1.6.16" apiVersion="v1" Feb 13 07:52:49.696088 kubelet[2569]: I0213 07:52:49.696079 2569 server.go:1168] "Started kubelet" Feb 13 07:52:49.696128 kubelet[2569]: I0213 07:52:49.696116 2569 server.go:162] "Starting to listen" address="0.0.0.0" port=10250 Feb 13 07:52:49.696378 kubelet[2569]: I0213 07:52:49.696368 2569 ratelimit.go:65] "Setting rate limiting for podresources endpoint" qps=100 burstTokens=10 Feb 13 07:52:49.696502 kubelet[2569]: E0213 07:52:49.696483 2569 cri_stats_provider.go:455] "Failed to get the info of the filesystem with mountpoint" err="unable to find data in memory cache" mountpoint="/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs" Feb 13 07:52:49.696538 kubelet[2569]: E0213 07:52:49.696515 2569 kubelet.go:1400] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Feb 13 07:52:49.697468 kubelet[2569]: I0213 07:52:49.697460 2569 server.go:461] "Adding debug handlers to kubelet server" Feb 13 07:52:49.696000 audit[2569]: AVC avc: denied { mac_admin } for pid=2569 comm="kubelet" capability=33 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.696000 audit: SELINUX_ERR op=setxattr invalid_context="system_u:object_r:container_file_t:s0" Feb 13 07:52:49.696000 audit[2569]: SYSCALL arch=c000003e syscall=188 success=no exit=-22 a0=c0011b6420 a1=c0011a03c0 a2=c0011b63f0 a3=25 items=0 ppid=1 pid=2569 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kubelet" exe="/opt/bin/kubelet" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:52:49.696000 audit: PROCTITLE proctitle=2F6F70742F62696E2F6B7562656C6574002D2D626F6F7473747261702D6B756265636F6E6669673D2F6574632F6B756265726E657465732F626F6F7473747261702D6B7562656C65742E636F6E66002D2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F6B7562656C65742E636F6E66002D2D636F6E6669 Feb 13 07:52:49.696000 audit[2569]: AVC avc: denied { mac_admin } for pid=2569 comm="kubelet" capability=33 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.696000 audit: SELINUX_ERR op=setxattr invalid_context="system_u:object_r:container_file_t:s0" Feb 13 07:52:49.696000 audit[2569]: SYSCALL arch=c000003e syscall=188 success=no exit=-22 a0=c000ec2480 a1=c0011a03d8 a2=c0011b64b0 a3=25 items=0 ppid=1 pid=2569 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kubelet" exe="/opt/bin/kubelet" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:52:49.696000 audit: PROCTITLE proctitle=2F6F70742F62696E2F6B7562656C6574002D2D626F6F7473747261702D6B756265636F6E6669673D2F6574632F6B756265726E657465732F626F6F7473747261702D6B7562656C65742E636F6E66002D2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F6B7562656C65742E636F6E66002D2D636F6E6669 Feb 13 07:52:49.697893 kubelet[2569]: I0213 07:52:49.697551 2569 kubelet.go:1355] "Unprivileged containerized 
plugins might not work, could not set selinux context on plugin registration dir" path="/var/lib/kubelet/plugins_registry" err="setxattr /var/lib/kubelet/plugins_registry: invalid argument" Feb 13 07:52:49.697893 kubelet[2569]: I0213 07:52:49.697573 2569 kubelet.go:1359] "Unprivileged containerized plugins might not work, could not set selinux context on plugins dir" path="/var/lib/kubelet/plugins" err="setxattr /var/lib/kubelet/plugins: invalid argument" Feb 13 07:52:49.697893 kubelet[2569]: I0213 07:52:49.697587 2569 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Feb 13 07:52:49.697893 kubelet[2569]: I0213 07:52:49.697618 2569 volume_manager.go:284] "Starting Kubelet Volume Manager" Feb 13 07:52:49.697893 kubelet[2569]: E0213 07:52:49.697641 2569 kubelet_node_status.go:458] "Error getting the current node from lister" err="node \"ci-3510.3.2-a-bf0bde3476\" not found" Feb 13 07:52:49.697893 kubelet[2569]: I0213 07:52:49.697672 2569 desired_state_of_world_populator.go:145] "Desired state populator starts to run" Feb 13 07:52:49.703979 kubelet[2569]: I0213 07:52:49.703958 2569 kubelet_network_linux.go:63] "Initialized iptables rules." protocol=IPv4 Feb 13 07:52:49.704491 kubelet[2569]: I0213 07:52:49.704480 2569 kubelet_network_linux.go:63] "Initialized iptables rules." protocol=IPv6 Feb 13 07:52:49.704536 kubelet[2569]: I0213 07:52:49.704500 2569 status_manager.go:207] "Starting to sync pod status with apiserver" Feb 13 07:52:49.704536 kubelet[2569]: I0213 07:52:49.704512 2569 kubelet.go:2257] "Starting kubelet main sync loop" Feb 13 07:52:49.704589 kubelet[2569]: E0213 07:52:49.704545 2569 kubelet.go:2281] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Feb 13 07:52:49.717725 kubelet[2569]: I0213 07:52:49.717705 2569 cpu_manager.go:214] "Starting CPU manager" policy="none" Feb 13 07:52:49.717861 kubelet[2569]: I0213 07:52:49.717751 2569 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Feb 13 07:52:49.717861 kubelet[2569]: I0213 07:52:49.717773 2569 state_mem.go:36] "Initialized new in-memory state store" Feb 13 07:52:49.718018 kubelet[2569]: I0213 07:52:49.718006 2569 state_mem.go:88] "Updated default CPUSet" cpuSet="" Feb 13 07:52:49.718062 kubelet[2569]: I0213 07:52:49.718027 2569 state_mem.go:96] "Updated CPUSet assignments" assignments=map[] Feb 13 07:52:49.718062 kubelet[2569]: I0213 07:52:49.718035 2569 policy_none.go:49] "None policy: Start" Feb 13 07:52:49.718484 kubelet[2569]: I0213 07:52:49.718470 2569 memory_manager.go:169] "Starting memorymanager" policy="None" Feb 13 07:52:49.718515 kubelet[2569]: I0213 07:52:49.718488 2569 state_mem.go:35] "Initializing new in-memory state store" Feb 13 07:52:49.718810 kubelet[2569]: I0213 07:52:49.718659 2569 state_mem.go:75] "Updated machine memory state" Feb 13 07:52:49.720625 kubelet[2569]: I0213 07:52:49.720615 2569 manager.go:455] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Feb 13 07:52:49.719000 audit[2569]: AVC avc: denied { mac_admin } for pid=2569 comm="kubelet" capability=33 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:52:49.719000 audit: SELINUX_ERR op=setxattr invalid_context="system_u:object_r:container_file_t:s0" Feb 13 07:52:49.719000 audit[2569]: SYSCALL arch=c000003e syscall=188 success=no exit=-22 a0=c0002d6630 a1=c000676420 a2=c0002d6600 a3=25 items=0 ppid=1 pid=2569 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kubelet" exe="/opt/bin/kubelet" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:52:49.719000 audit: PROCTITLE proctitle=2F6F70742F62696E2F6B7562656C6574002D2D626F6F7473747261702D6B756265636F6E6669673D2F6574632F6B756265726E657465732F626F6F7473747261702D6B7562656C65742E636F6E66002D2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F6B7562656C65742E636F6E66002D2D636F6E6669 Feb 13 07:52:49.720791 kubelet[2569]: I0213 07:52:49.720656 2569 server.go:88] "Unprivileged containerized plugins might not work. Could not set selinux context on socket dir" path="/var/lib/kubelet/device-plugins/" err="setxattr /var/lib/kubelet/device-plugins/: invalid argument" Feb 13 07:52:49.720791 kubelet[2569]: I0213 07:52:49.720766 2569 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Feb 13 07:52:49.799812 kubelet[2569]: I0213 07:52:49.799765 2569 kubelet_node_status.go:70] "Attempting to register node" node="ci-3510.3.2-a-bf0bde3476" Feb 13 07:52:49.804927 kubelet[2569]: I0213 07:52:49.804916 2569 topology_manager.go:212] "Topology Admit Handler" Feb 13 07:52:49.804927 kubelet[2569]: I0213 07:52:49.804926 2569 kubelet_node_status.go:108] "Node was previously registered" node="ci-3510.3.2-a-bf0bde3476" Feb 13 07:52:49.804996 kubelet[2569]: I0213 07:52:49.804971 2569 topology_manager.go:212] "Topology Admit Handler" Feb 13 07:52:49.804996 kubelet[2569]: I0213 07:52:49.804974 2569 kubelet_node_status.go:73] "Successfully registered node" node="ci-3510.3.2-a-bf0bde3476" Feb 13 07:52:49.804996 kubelet[2569]: I0213 07:52:49.804991 2569 topology_manager.go:212] "Topology Admit Handler" Feb 13 07:52:49.807455 kubelet[2569]: W0213 07:52:49.807442 2569 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Feb 13 07:52:49.808411 kubelet[2569]: W0213 07:52:49.808402 2569 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Feb 13 07:52:49.809107 kubelet[2569]: W0213 07:52:49.809072 2569 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Feb 13 07:52:49.809107 kubelet[2569]: E0213 07:52:49.809099 2569 kubelet.go:1856] "Failed creating a mirror pod for" err="pods \"kube-apiserver-ci-3510.3.2-a-bf0bde3476\" already exists" pod="kube-system/kube-apiserver-ci-3510.3.2-a-bf0bde3476" Feb 13 07:52:50.008000 kubelet[2569]: I0213 07:52:50.007915 2569 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/0a87ed2b1942aaccd48ed0dcff5d4971-ca-certs\") pod \"kube-apiserver-ci-3510.3.2-a-bf0bde3476\" (UID: \"0a87ed2b1942aaccd48ed0dcff5d4971\") " pod="kube-system/kube-apiserver-ci-3510.3.2-a-bf0bde3476" Feb 13 07:52:50.008000 kubelet[2569]: I0213 07:52:50.007946 2569 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/7e9cf9721535bab49bc5b91ca08afb11-flexvolume-dir\") pod \"kube-controller-manager-ci-3510.3.2-a-bf0bde3476\" (UID: \"7e9cf9721535bab49bc5b91ca08afb11\") " pod="kube-system/kube-controller-manager-ci-3510.3.2-a-bf0bde3476" Feb 13 07:52:50.008000 kubelet[2569]: I0213 07:52:50.007961 2569 
reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/7e9cf9721535bab49bc5b91ca08afb11-k8s-certs\") pod \"kube-controller-manager-ci-3510.3.2-a-bf0bde3476\" (UID: \"7e9cf9721535bab49bc5b91ca08afb11\") " pod="kube-system/kube-controller-manager-ci-3510.3.2-a-bf0bde3476" Feb 13 07:52:50.008000 kubelet[2569]: I0213 07:52:50.007973 2569 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/7e9cf9721535bab49bc5b91ca08afb11-kubeconfig\") pod \"kube-controller-manager-ci-3510.3.2-a-bf0bde3476\" (UID: \"7e9cf9721535bab49bc5b91ca08afb11\") " pod="kube-system/kube-controller-manager-ci-3510.3.2-a-bf0bde3476" Feb 13 07:52:50.008000 kubelet[2569]: I0213 07:52:50.007995 2569 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/b134222ffd42ce786b8efd601ab30089-kubeconfig\") pod \"kube-scheduler-ci-3510.3.2-a-bf0bde3476\" (UID: \"b134222ffd42ce786b8efd601ab30089\") " pod="kube-system/kube-scheduler-ci-3510.3.2-a-bf0bde3476" Feb 13 07:52:50.008167 kubelet[2569]: I0213 07:52:50.008021 2569 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/0a87ed2b1942aaccd48ed0dcff5d4971-k8s-certs\") pod \"kube-apiserver-ci-3510.3.2-a-bf0bde3476\" (UID: \"0a87ed2b1942aaccd48ed0dcff5d4971\") " pod="kube-system/kube-apiserver-ci-3510.3.2-a-bf0bde3476" Feb 13 07:52:50.008167 kubelet[2569]: I0213 07:52:50.008045 2569 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/0a87ed2b1942aaccd48ed0dcff5d4971-usr-share-ca-certificates\") pod \"kube-apiserver-ci-3510.3.2-a-bf0bde3476\" (UID: \"0a87ed2b1942aaccd48ed0dcff5d4971\") " pod="kube-system/kube-apiserver-ci-3510.3.2-a-bf0bde3476" Feb 13 07:52:50.008167 kubelet[2569]: I0213 07:52:50.008069 2569 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/7e9cf9721535bab49bc5b91ca08afb11-ca-certs\") pod \"kube-controller-manager-ci-3510.3.2-a-bf0bde3476\" (UID: \"7e9cf9721535bab49bc5b91ca08afb11\") " pod="kube-system/kube-controller-manager-ci-3510.3.2-a-bf0bde3476" Feb 13 07:52:50.008167 kubelet[2569]: I0213 07:52:50.008096 2569 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/7e9cf9721535bab49bc5b91ca08afb11-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-3510.3.2-a-bf0bde3476\" (UID: \"7e9cf9721535bab49bc5b91ca08afb11\") " pod="kube-system/kube-controller-manager-ci-3510.3.2-a-bf0bde3476" Feb 13 07:52:50.695334 kubelet[2569]: I0213 07:52:50.695227 2569 apiserver.go:52] "Watching apiserver" Feb 13 07:52:50.717362 kubelet[2569]: W0213 07:52:50.717274 2569 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Feb 13 07:52:50.717622 kubelet[2569]: E0213 07:52:50.717412 2569 kubelet.go:1856] "Failed creating a mirror pod for" err="pods \"kube-apiserver-ci-3510.3.2-a-bf0bde3476\" already exists" pod="kube-system/kube-apiserver-ci-3510.3.2-a-bf0bde3476" Feb 13 07:52:50.748476 
kubelet[2569]: I0213 07:52:50.748430 2569 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-3510.3.2-a-bf0bde3476" podStartSLOduration=1.7483594839999999 podCreationTimestamp="2024-02-13 07:52:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-02-13 07:52:50.748353218 +0000 UTC m=+1.099404951" watchObservedRunningTime="2024-02-13 07:52:50.748359484 +0000 UTC m=+1.099411206" Feb 13 07:52:50.764260 kubelet[2569]: I0213 07:52:50.764200 2569 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-3510.3.2-a-bf0bde3476" podStartSLOduration=3.764155101 podCreationTimestamp="2024-02-13 07:52:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-02-13 07:52:50.764084745 +0000 UTC m=+1.115136469" watchObservedRunningTime="2024-02-13 07:52:50.764155101 +0000 UTC m=+1.115206827" Feb 13 07:52:50.764405 kubelet[2569]: I0213 07:52:50.764306 2569 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-3510.3.2-a-bf0bde3476" podStartSLOduration=1.764278564 podCreationTimestamp="2024-02-13 07:52:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-02-13 07:52:50.756650976 +0000 UTC m=+1.107702696" watchObservedRunningTime="2024-02-13 07:52:50.764278564 +0000 UTC m=+1.115330277" Feb 13 07:52:50.798509 kubelet[2569]: I0213 07:52:50.798444 2569 desired_state_of_world_populator.go:153] "Finished populating initial desired state of world" Feb 13 07:52:50.811384 kubelet[2569]: I0213 07:52:50.811289 2569 reconciler.go:41] "Reconciler: start to sync state" Feb 13 07:52:50.857000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:52:50.857000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c0010a1140 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 07:52:50.857000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 07:52:50.860000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:52:50.860000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c0002b7780 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 07:52:50.860000 audit: PROCTITLE 
proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 07:52:50.863000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:52:50.863000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c00114e7e0 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 07:52:50.863000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 07:52:50.866000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:52:50.866000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c00114e860 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 07:52:50.866000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 07:52:55.500091 sudo[1615]: pam_unix(sudo:session): session closed for user root Feb 13 07:52:55.499000 audit[1615]: USER_END pid=1615 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_unix,pam_permit,pam_systemd acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Feb 13 07:52:55.500942 sshd[1612]: pam_unix(sshd:session): session closed for user core Feb 13 07:52:55.502399 systemd[1]: sshd@6-145.40.90.207:22-139.178.68.195:51368.service: Deactivated successfully. Feb 13 07:52:55.502863 systemd[1]: session-9.scope: Deactivated successfully. Feb 13 07:52:55.502956 systemd[1]: session-9.scope: Consumed 3.033s CPU time. Feb 13 07:52:55.503312 systemd-logind[1446]: Session 9 logged out. Waiting for processes to exit. Feb 13 07:52:55.503956 systemd-logind[1446]: Removed session 9. Feb 13 07:52:55.527194 kernel: kauditd_printk_skb: 287 callbacks suppressed Feb 13 07:52:55.527233 kernel: audit: type=1106 audit(1707810775.499:945): pid=1615 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_unix,pam_permit,pam_systemd acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? 
res=success' Feb 13 07:52:55.499000 audit[1615]: CRED_DISP pid=1615 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Feb 13 07:52:55.705214 kernel: audit: type=1104 audit(1707810775.499:946): pid=1615 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Feb 13 07:52:55.705263 kernel: audit: type=1106 audit(1707810775.500:947): pid=1612 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 07:52:55.500000 audit[1612]: USER_END pid=1612 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 07:52:55.801626 kernel: audit: type=1104 audit(1707810775.501:948): pid=1612 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 07:52:55.501000 audit[1612]: CRED_DISP pid=1612 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 07:52:55.501000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@6-145.40.90.207:22-139.178.68.195:51368 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:52:55.892689 kernel: audit: type=1131 audit(1707810775.501:949): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@6-145.40.90.207:22-139.178.68.195:51368 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:53:01.938435 kubelet[2569]: I0213 07:53:01.938391 2569 kuberuntime_manager.go:1460] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Feb 13 07:53:01.938786 env[1458]: time="2024-02-13T07:53:01.938691413Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Feb 13 07:53:01.938984 kubelet[2569]: I0213 07:53:01.938864 2569 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Feb 13 07:53:02.787088 kubelet[2569]: I0213 07:53:02.787020 2569 topology_manager.go:212] "Topology Admit Handler" Feb 13 07:53:02.800368 systemd[1]: Created slice kubepods-besteffort-pod8840856d_6896_411d_9026_08a92ba8cd5a.slice. Feb 13 07:53:02.846088 kubelet[2569]: I0213 07:53:02.846057 2569 topology_manager.go:212] "Topology Admit Handler" Feb 13 07:53:02.853837 systemd[1]: Created slice kubepods-besteffort-podbde7ebc8_2894_4ea6_89d3_a685c1681ec2.slice. 
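The PROCTITLE field in the audit records above carries the audited process's full command line, hex-encoded because the kernel stores argv as NUL-separated bytes (the field is also truncated at a fixed length, which is why several records end mid-argument). Decoded, the kube-controller-manager records read `kube-controller-manager --allocate-node-cidrs=true --authentication-kubeconfig=/etc/kubernetes/controller-manager.conf --authori`, with the tail cut off in the record itself. A minimal Go sketch of the decoding (decodeProctitle is an illustrative helper, not part of any tool appearing in this log):

    package main

    import (
        "bytes"
        "encoding/hex"
        "fmt"
        "strings"
    )

    // decodeProctitle turns an audit PROCTITLE payload back into a command
    // line. The kernel hex-encodes the field because the argv elements it
    // records are separated by NUL bytes.
    func decodeProctitle(h string) (string, error) {
        raw, err := hex.DecodeString(h)
        if err != nil {
            return "", err
        }
        args := bytes.Split(raw, []byte{0})
        parts := make([]string, 0, len(args))
        for _, a := range args {
            parts = append(parts, string(a))
        }
        return strings.Join(parts, " "), nil
    }

    func main() {
        // Leading portion of the kube-controller-manager PROCTITLE above.
        const h = "6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565"
        cmdline, err := decodeProctitle(h)
        if err != nil {
            panic(err)
        }
        fmt.Println(cmdline) // kube-controller-manager --allocate-node-cidrs=true
    }

Run against the kubelet records earlier in the log, the same helper yields `/opt/bin/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --kubeconfig=/etc/kubernetes/kubelet.conf --confi`, again truncated by the record limit.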
Feb 13 07:53:02.884880 kubelet[2569]: I0213 07:53:02.884826 2569 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/8840856d-6896-411d-9026-08a92ba8cd5a-xtables-lock\") pod \"kube-proxy-2cjzx\" (UID: \"8840856d-6896-411d-9026-08a92ba8cd5a\") " pod="kube-system/kube-proxy-2cjzx" Feb 13 07:53:02.885297 kubelet[2569]: I0213 07:53:02.884933 2569 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/bde7ebc8-2894-4ea6-89d3-a685c1681ec2-var-lib-calico\") pod \"tigera-operator-7ff8dc855-gmcwp\" (UID: \"bde7ebc8-2894-4ea6-89d3-a685c1681ec2\") " pod="tigera-operator/tigera-operator-7ff8dc855-gmcwp" Feb 13 07:53:02.885297 kubelet[2569]: I0213 07:53:02.885003 2569 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qk6vh\" (UniqueName: \"kubernetes.io/projected/bde7ebc8-2894-4ea6-89d3-a685c1681ec2-kube-api-access-qk6vh\") pod \"tigera-operator-7ff8dc855-gmcwp\" (UID: \"bde7ebc8-2894-4ea6-89d3-a685c1681ec2\") " pod="tigera-operator/tigera-operator-7ff8dc855-gmcwp" Feb 13 07:53:02.885297 kubelet[2569]: I0213 07:53:02.885065 2569 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/8840856d-6896-411d-9026-08a92ba8cd5a-kube-proxy\") pod \"kube-proxy-2cjzx\" (UID: \"8840856d-6896-411d-9026-08a92ba8cd5a\") " pod="kube-system/kube-proxy-2cjzx" Feb 13 07:53:02.885681 kubelet[2569]: I0213 07:53:02.885372 2569 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xgxr\" (UniqueName: \"kubernetes.io/projected/8840856d-6896-411d-9026-08a92ba8cd5a-kube-api-access-5xgxr\") pod \"kube-proxy-2cjzx\" (UID: \"8840856d-6896-411d-9026-08a92ba8cd5a\") " pod="kube-system/kube-proxy-2cjzx" Feb 13 07:53:02.885681 kubelet[2569]: I0213 07:53:02.885532 2569 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8840856d-6896-411d-9026-08a92ba8cd5a-lib-modules\") pod \"kube-proxy-2cjzx\" (UID: \"8840856d-6896-411d-9026-08a92ba8cd5a\") " pod="kube-system/kube-proxy-2cjzx" Feb 13 07:53:03.125219 env[1458]: time="2024-02-13T07:53:03.124986475Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-2cjzx,Uid:8840856d-6896-411d-9026-08a92ba8cd5a,Namespace:kube-system,Attempt:0,}" Feb 13 07:53:03.151439 env[1458]: time="2024-02-13T07:53:03.151211383Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 07:53:03.151439 env[1458]: time="2024-02-13T07:53:03.151306466Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 07:53:03.151439 env[1458]: time="2024-02-13T07:53:03.151343331Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 07:53:03.152046 env[1458]: time="2024-02-13T07:53:03.151836265Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/9ba9eced023b5f310842c2e9b5bfbbd44a046ceba1b4043d31b6bfd38c4efeae pid=2730 runtime=io.containerd.runc.v2 Feb 13 07:53:03.160511 env[1458]: time="2024-02-13T07:53:03.160408278Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7ff8dc855-gmcwp,Uid:bde7ebc8-2894-4ea6-89d3-a685c1681ec2,Namespace:tigera-operator,Attempt:0,}" Feb 13 07:53:03.183649 env[1458]: time="2024-02-13T07:53:03.183426713Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 07:53:03.183649 env[1458]: time="2024-02-13T07:53:03.183545468Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 07:53:03.183649 env[1458]: time="2024-02-13T07:53:03.183584555Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 07:53:03.184161 env[1458]: time="2024-02-13T07:53:03.184023761Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/6f1c235cb066f321070d62e41eb3613be9267d40bcf99ef77ae7e6ef1e5202c6 pid=2758 runtime=io.containerd.runc.v2 Feb 13 07:53:03.186732 systemd[1]: Started cri-containerd-9ba9eced023b5f310842c2e9b5bfbbd44a046ceba1b4043d31b6bfd38c4efeae.scope. Feb 13 07:53:03.197000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.197000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.327809 kernel: audit: type=1400 audit(1707810783.197:950): avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.327872 kernel: audit: type=1400 audit(1707810783.197:951): avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.327908 kernel: audit: type=1400 audit(1707810783.197:952): avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.197000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.329179 systemd[1]: Started cri-containerd-6f1c235cb066f321070d62e41eb3613be9267d40bcf99ef77ae7e6ef1e5202c6.scope. 
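The burst of `{ bpf }` and `{ perfmon }` AVC records that begins here accompanies the two container shims starting up: capability 39 is CAP_BPF, capability 38 is CAP_PERFMON, syscall 321 is bpf(2) on x86_64, and the paired `BPF prog-id=NNN op=LOAD`/`op=UNLOAD` events are most likely the device-cgroup filters runc installs for each container on cgroup v2. The adjacent SYSCALL records report success=yes, so container setup proceeds despite the denials being logged. For skimming long runs like this, a small parser helps; the sketch below is a deliberately minimal pattern rather than a full audit-record grammar (avcRe is an illustrative name, and the sample line is taken from the runc records that follow):

    package main

    import (
        "fmt"
        "regexp"
    )

    // avcRe captures the fields worth scanning for in an SELinux AVC
    // denial: the denied permission, pid, comm, both security contexts,
    // and the target class.
    var avcRe = regexp.MustCompile(
        `avc:\s+denied\s+\{ (?P<perm>[^}]+) \} for\s+pid=(?P<pid>\d+)` +
            `\s+comm="(?P<comm>[^"]+)".*?scontext=(?P<scontext>\S+)` +
            `\s+tcontext=(?P<tcontext>\S+)\s+tclass=(?P<tclass>\S+)`)

    func main() {
        line := `avc: denied { bpf } for pid=2741 comm="runc" capability=39 ` +
            `scontext=system_u:system_r:kernel_t:s0 ` +
            `tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0`
        m := avcRe.FindStringSubmatch(line)
        if m == nil {
            fmt.Println("no match")
            return
        }
        for i, name := range avcRe.SubexpNames() {
            if i > 0 && name != "" {
                fmt.Printf("%-9s %s\n", name, m[i])
            }
        }
    }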
Feb 13 07:53:03.197000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.456349 kernel: audit: type=1400 audit(1707810783.197:953): avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.456403 kernel: audit: type=1400 audit(1707810783.197:954): avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.197000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.197000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.585477 kernel: audit: type=1400 audit(1707810783.197:955): avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.585532 kernel: audit: type=1400 audit(1707810783.197:956): avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.197000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.650041 kernel: audit: audit_backlog=65 > audit_backlog_limit=64 Feb 13 07:53:03.650093 kernel: audit: audit_lost=1 audit_rate_limit=0 audit_backlog_limit=64 Feb 13 07:53:03.650128 kernel: audit: backlog limit exceeded Feb 13 07:53:03.197000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.197000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.326000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.326000 audit: BPF prog-id=109 op=LOAD Feb 13 07:53:03.326000 audit[2741]: AVC avc: denied { bpf } for pid=2741 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.326000 audit[2741]: SYSCALL arch=c000003e syscall=321 success=yes exit=0 a0=f a1=c000197c48 a2=10 a3=1c items=0 ppid=2730 pid=2741 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:03.326000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3962613965636564303233623566333130383432633265396235626662 Feb 13 07:53:03.326000 audit[2741]: AVC avc: denied { perfmon } for pid=2741 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.326000 audit[2741]: SYSCALL arch=c000003e syscall=321 success=yes exit=15 a0=0 a1=c0001976b0 a2=3c a3=c items=0 ppid=2730 pid=2741 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:03.326000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3962613965636564303233623566333130383432633265396235626662 Feb 13 07:53:03.326000 audit[2741]: AVC avc: denied { bpf } for pid=2741 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.326000 audit[2741]: AVC avc: denied { bpf } for pid=2741 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.326000 audit[2741]: AVC avc: denied { bpf } for pid=2741 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.326000 audit[2741]: AVC avc: denied { perfmon } for pid=2741 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.326000 audit[2741]: AVC avc: denied { perfmon } for pid=2741 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.326000 audit[2741]: AVC avc: denied { perfmon } for pid=2741 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.326000 audit[2741]: AVC avc: denied { perfmon } for pid=2741 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.326000 audit[2741]: AVC avc: denied { perfmon } for pid=2741 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.326000 audit[2741]: AVC avc: denied { bpf } for pid=2741 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.331000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.331000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.331000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 
scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.331000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.331000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.331000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.331000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.331000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.331000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.326000 audit[2741]: AVC avc: denied { bpf } for pid=2741 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.326000 audit: BPF prog-id=110 op=LOAD Feb 13 07:53:03.326000 audit[2741]: SYSCALL arch=c000003e syscall=321 success=yes exit=15 a0=5 a1=c0001979d8 a2=78 a3=c0001dd550 items=0 ppid=2730 pid=2741 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:03.326000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3962613965636564303233623566333130383432633265396235626662 Feb 13 07:53:03.454000 audit[2741]: AVC avc: denied { bpf } for pid=2741 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.454000 audit[2741]: AVC avc: denied { bpf } for pid=2741 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.454000 audit[2741]: AVC avc: denied { perfmon } for pid=2741 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.454000 audit[2741]: AVC avc: denied { perfmon } for pid=2741 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.454000 audit[2741]: AVC avc: denied { perfmon } for pid=2741 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.454000 audit[2741]: AVC avc: denied { perfmon } for pid=2741 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 
permissive=0 Feb 13 07:53:03.454000 audit[2741]: AVC avc: denied { perfmon } for pid=2741 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.454000 audit[2741]: AVC avc: denied { bpf } for pid=2741 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.519000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.519000 audit: BPF prog-id=111 op=LOAD Feb 13 07:53:03.519000 audit[2767]: AVC avc: denied { bpf } for pid=2767 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.519000 audit[2767]: SYSCALL arch=c000003e syscall=321 success=yes exit=0 a0=f a1=c000147c48 a2=10 a3=1c items=0 ppid=2758 pid=2767 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:03.519000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3666316332333563623036366633323130373064363265343165623336 Feb 13 07:53:03.519000 audit[2767]: AVC avc: denied { perfmon } for pid=2767 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.519000 audit[2767]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=0 a1=c0001476b0 a2=3c a3=c items=0 ppid=2758 pid=2767 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:03.519000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3666316332333563623036366633323130373064363265343165623336 Feb 13 07:53:03.519000 audit[2767]: AVC avc: denied { bpf } for pid=2767 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.519000 audit[2767]: AVC avc: denied { bpf } for pid=2767 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.519000 audit[2767]: AVC avc: denied { bpf } for pid=2767 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.519000 audit[2767]: AVC avc: denied { perfmon } for pid=2767 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.519000 audit[2767]: AVC avc: denied { perfmon } for pid=2767 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.519000 audit[2767]: AVC avc: denied { perfmon } for pid=2767 
comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.519000 audit[2767]: AVC avc: denied { perfmon } for pid=2767 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.519000 audit[2767]: AVC avc: denied { perfmon } for pid=2767 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.519000 audit[2767]: AVC avc: denied { bpf } for pid=2767 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.454000 audit[2741]: AVC avc: denied { bpf } for pid=2741 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.454000 audit: BPF prog-id=112 op=LOAD Feb 13 07:53:03.454000 audit[2741]: SYSCALL arch=c000003e syscall=321 success=yes exit=17 a0=5 a1=c000197770 a2=78 a3=c0001dd598 items=0 ppid=2730 pid=2741 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:03.454000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3962613965636564303233623566333130383432633265396235626662 Feb 13 07:53:03.729000 audit: BPF prog-id=112 op=UNLOAD Feb 13 07:53:03.729000 audit: BPF prog-id=110 op=UNLOAD Feb 13 07:53:03.729000 audit[2741]: AVC avc: denied { bpf } for pid=2741 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.519000 audit[2767]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=5 a1=c0001479d8 a2=78 a3=c0003a41f0 items=0 ppid=2758 pid=2767 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:03.519000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3666316332333563623036366633323130373064363265343165623336 Feb 13 07:53:03.729000 audit[2741]: AVC avc: denied { bpf } for pid=2741 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.729000 audit[2741]: AVC avc: denied { bpf } for pid=2741 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.729000 audit[2741]: AVC avc: denied { perfmon } for pid=2741 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.729000 audit[2741]: AVC avc: denied { perfmon } for pid=2741 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.729000 audit[2741]: 
AVC avc: denied { perfmon } for pid=2741 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.729000 audit[2767]: AVC avc: denied { bpf } for pid=2767 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.729000 audit[2741]: AVC avc: denied { perfmon } for pid=2741 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.729000 audit[2741]: AVC avc: denied { perfmon } for pid=2741 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.729000 audit[2741]: AVC avc: denied { bpf } for pid=2741 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.729000 audit[2767]: AVC avc: denied { bpf } for pid=2767 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.729000 audit[2767]: AVC avc: denied { perfmon } for pid=2767 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.729000 audit[2767]: AVC avc: denied { perfmon } for pid=2767 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.729000 audit[2767]: AVC avc: denied { perfmon } for pid=2767 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.729000 audit[2767]: AVC avc: denied { perfmon } for pid=2767 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.729000 audit[2767]: AVC avc: denied { perfmon } for pid=2767 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.729000 audit[2767]: AVC avc: denied { bpf } for pid=2767 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.729000 audit[2767]: AVC avc: denied { bpf } for pid=2767 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.729000 audit: BPF prog-id=114 op=LOAD Feb 13 07:53:03.729000 audit[2767]: SYSCALL arch=c000003e syscall=321 success=yes exit=18 a0=5 a1=c000147770 a2=78 a3=c0003a4238 items=0 ppid=2758 pid=2767 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:03.729000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3666316332333563623036366633323130373064363265343165623336 Feb 13 07:53:03.729000 audit: BPF prog-id=114 op=UNLOAD Feb 13 07:53:03.729000 audit: BPF prog-id=113 op=UNLOAD Feb 13 
07:53:03.729000 audit[2767]: AVC avc: denied { bpf } for pid=2767 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.729000 audit[2767]: AVC avc: denied { bpf } for pid=2767 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.729000 audit[2767]: AVC avc: denied { bpf } for pid=2767 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.729000 audit[2767]: AVC avc: denied { perfmon } for pid=2767 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.729000 audit[2767]: AVC avc: denied { perfmon } for pid=2767 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.729000 audit[2767]: AVC avc: denied { perfmon } for pid=2767 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.729000 audit[2767]: AVC avc: denied { perfmon } for pid=2767 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.729000 audit[2767]: AVC avc: denied { perfmon } for pid=2767 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.729000 audit[2767]: AVC avc: denied { bpf } for pid=2767 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.729000 audit[2741]: AVC avc: denied { bpf } for pid=2741 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.729000 audit: BPF prog-id=115 op=LOAD Feb 13 07:53:03.729000 audit[2741]: SYSCALL arch=c000003e syscall=321 success=yes exit=15 a0=5 a1=c000197c30 a2=78 a3=c0001dd9a8 items=0 ppid=2730 pid=2741 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:03.729000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3962613965636564303233623566333130383432633265396235626662 Feb 13 07:53:03.729000 audit[2767]: AVC avc: denied { bpf } for pid=2767 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.729000 audit: BPF prog-id=116 op=LOAD Feb 13 07:53:03.729000 audit[2767]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=5 a1=c000147c30 a2=78 a3=c0003a4648 items=0 ppid=2758 pid=2767 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:03.729000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3666316332333563623036366633323130373064363265343165623336 Feb 13 07:53:03.735254 env[1458]: time="2024-02-13T07:53:03.735230998Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-2cjzx,Uid:8840856d-6896-411d-9026-08a92ba8cd5a,Namespace:kube-system,Attempt:0,} returns sandbox id \"9ba9eced023b5f310842c2e9b5bfbbd44a046ceba1b4043d31b6bfd38c4efeae\"" Feb 13 07:53:03.736393 env[1458]: time="2024-02-13T07:53:03.736374865Z" level=info msg="CreateContainer within sandbox \"9ba9eced023b5f310842c2e9b5bfbbd44a046ceba1b4043d31b6bfd38c4efeae\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Feb 13 07:53:03.741695 env[1458]: time="2024-02-13T07:53:03.741619151Z" level=info msg="CreateContainer within sandbox \"9ba9eced023b5f310842c2e9b5bfbbd44a046ceba1b4043d31b6bfd38c4efeae\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"97dd01585478d45466b9d9c14ba6c956160a32d710fd7e2704e5a05031489a76\"" Feb 13 07:53:03.741938 env[1458]: time="2024-02-13T07:53:03.741919124Z" level=info msg="StartContainer for \"97dd01585478d45466b9d9c14ba6c956160a32d710fd7e2704e5a05031489a76\"" Feb 13 07:53:03.748036 env[1458]: time="2024-02-13T07:53:03.747991949Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7ff8dc855-gmcwp,Uid:bde7ebc8-2894-4ea6-89d3-a685c1681ec2,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"6f1c235cb066f321070d62e41eb3613be9267d40bcf99ef77ae7e6ef1e5202c6\"" Feb 13 07:53:03.749098 env[1458]: time="2024-02-13T07:53:03.749080384Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.32.3\"" Feb 13 07:53:03.749937 systemd[1]: Started cri-containerd-97dd01585478d45466b9d9c14ba6c956160a32d710fd7e2704e5a05031489a76.scope. 
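The records above complete the CRI lifecycle for both pods: RunPodSandbox returns a sandbox ID, CreateContainer and StartContainer bring up kube-proxy inside its sandbox, and the tigera-operator sandbox moves on to PullImage for quay.io/tigera/operator:v1.32.3. The NETFILTER_CFG events at the end of this section are then kube-proxy's first iptables writes; their PROCTITLE decodes to `iptables -w 5 -W 100000 -N KUBE-PROXY-CANARY -t mangle`, the canary chain kube-proxy creates so it can later detect an external flush of its rules. A hedged Go sketch that simply replays that decoded invocation (illustrative only; it needs root and the iptables-nft binary the log shows at /usr/sbin/xtables-nft-multi):

    package main

    import (
        "fmt"
        "os/exec"
    )

    // Replays the iptables call decoded from the NETFILTER_CFG PROCTITLE
    // at the end of this section: create the KUBE-PROXY-CANARY chain in
    // the mangle table.
    func main() {
        cmd := exec.Command("iptables",
            "-w", "5", // wait up to 5 seconds for the xtables lock
            "-W", "100000", // poll interval in microseconds while waiting
            "-N", "KUBE-PROXY-CANARY", // create the canary chain
            "-t", "mangle", // in the mangle table
        )
        if out, err := cmd.CombinedOutput(); err != nil {
            fmt.Printf("iptables failed: %v: %s\n", err, out)
        }
    }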
Feb 13 07:53:03.756000 audit[2805]: AVC avc: denied { perfmon } for pid=2805 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.756000 audit[2805]: SYSCALL arch=c000003e syscall=321 success=yes exit=15 a0=0 a1=c0001bd6b0 a2=3c a3=7f59f275b008 items=0 ppid=2730 pid=2805 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:03.756000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3937646430313538353437386434353436366239643963313462613663 Feb 13 07:53:03.756000 audit[2805]: AVC avc: denied { bpf } for pid=2805 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.756000 audit[2805]: AVC avc: denied { bpf } for pid=2805 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.756000 audit[2805]: AVC avc: denied { bpf } for pid=2805 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.756000 audit[2805]: AVC avc: denied { perfmon } for pid=2805 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.756000 audit[2805]: AVC avc: denied { perfmon } for pid=2805 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.756000 audit[2805]: AVC avc: denied { perfmon } for pid=2805 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.756000 audit[2805]: AVC avc: denied { perfmon } for pid=2805 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.756000 audit[2805]: AVC avc: denied { perfmon } for pid=2805 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.756000 audit[2805]: AVC avc: denied { bpf } for pid=2805 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.756000 audit[2805]: AVC avc: denied { bpf } for pid=2805 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.756000 audit: BPF prog-id=117 op=LOAD Feb 13 07:53:03.756000 audit[2805]: SYSCALL arch=c000003e syscall=321 success=yes exit=15 a0=5 a1=c0001bd9d8 a2=78 a3=c0003d0118 items=0 ppid=2730 pid=2805 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:03.756000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3937646430313538353437386434353436366239643963313462613663 Feb 13 07:53:03.757000 audit[2805]: AVC avc: denied { bpf } for pid=2805 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.757000 audit[2805]: AVC avc: denied { bpf } for pid=2805 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.757000 audit[2805]: AVC avc: denied { perfmon } for pid=2805 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.757000 audit[2805]: AVC avc: denied { perfmon } for pid=2805 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.757000 audit[2805]: AVC avc: denied { perfmon } for pid=2805 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.757000 audit[2805]: AVC avc: denied { perfmon } for pid=2805 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.757000 audit[2805]: AVC avc: denied { perfmon } for pid=2805 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.757000 audit[2805]: AVC avc: denied { bpf } for pid=2805 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.757000 audit[2805]: AVC avc: denied { bpf } for pid=2805 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.757000 audit: BPF prog-id=118 op=LOAD Feb 13 07:53:03.757000 audit[2805]: SYSCALL arch=c000003e syscall=321 success=yes exit=17 a0=5 a1=c0001bd770 a2=78 a3=c0003d0168 items=0 ppid=2730 pid=2805 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:03.757000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3937646430313538353437386434353436366239643963313462613663 Feb 13 07:53:03.757000 audit: BPF prog-id=118 op=UNLOAD Feb 13 07:53:03.757000 audit: BPF prog-id=117 op=UNLOAD Feb 13 07:53:03.757000 audit[2805]: AVC avc: denied { bpf } for pid=2805 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.757000 audit[2805]: AVC avc: denied { bpf } for pid=2805 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.757000 audit[2805]: AVC avc: denied { bpf } for pid=2805 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 
tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.757000 audit[2805]: AVC avc: denied { perfmon } for pid=2805 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.757000 audit[2805]: AVC avc: denied { perfmon } for pid=2805 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.757000 audit[2805]: AVC avc: denied { perfmon } for pid=2805 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.757000 audit[2805]: AVC avc: denied { perfmon } for pid=2805 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.757000 audit[2805]: AVC avc: denied { perfmon } for pid=2805 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.757000 audit[2805]: AVC avc: denied { bpf } for pid=2805 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.757000 audit[2805]: AVC avc: denied { bpf } for pid=2805 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:03.757000 audit: BPF prog-id=119 op=LOAD Feb 13 07:53:03.757000 audit[2805]: SYSCALL arch=c000003e syscall=321 success=yes exit=15 a0=5 a1=c0001bdc30 a2=78 a3=c0003d01f8 items=0 ppid=2730 pid=2805 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:03.757000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3937646430313538353437386434353436366239643963313462613663 Feb 13 07:53:03.764282 env[1458]: time="2024-02-13T07:53:03.764232061Z" level=info msg="StartContainer for \"97dd01585478d45466b9d9c14ba6c956160a32d710fd7e2704e5a05031489a76\" returns successfully" Feb 13 07:53:03.791000 audit[2872]: NETFILTER_CFG table=mangle:38 family=2 entries=1 op=nft_register_chain pid=2872 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 07:53:03.791000 audit[2872]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffd5b065300 a2=0 a3=7ffd5b0652ec items=0 ppid=2822 pid=2872 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:03.791000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Feb 13 07:53:03.791000 audit[2873]: NETFILTER_CFG table=mangle:39 family=10 entries=1 op=nft_register_chain pid=2873 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 07:53:03.791000 audit[2873]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffced4ccc40 a2=0 a3=7ffced4ccc2c items=0 ppid=2822 pid=2873 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:03.791000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Feb 13 07:53:03.791000 audit[2874]: NETFILTER_CFG table=nat:40 family=2 entries=1 op=nft_register_chain pid=2874 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 07:53:03.791000 audit[2874]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffcb53daed0 a2=0 a3=7ffcb53daebc items=0 ppid=2822 pid=2874 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:03.791000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Feb 13 07:53:03.792000 audit[2875]: NETFILTER_CFG table=nat:41 family=10 entries=1 op=nft_register_chain pid=2875 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 07:53:03.792000 audit[2875]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffc4e8d3b90 a2=0 a3=7ffc4e8d3b7c items=0 ppid=2822 pid=2875 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:03.792000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Feb 13 07:53:03.792000 audit[2876]: NETFILTER_CFG table=filter:42 family=2 entries=1 op=nft_register_chain pid=2876 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 07:53:03.792000 audit[2876]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffec2ad3ac0 a2=0 a3=7ffec2ad3aac items=0 ppid=2822 pid=2876 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:03.792000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Feb 13 07:53:03.792000 audit[2877]: NETFILTER_CFG table=filter:43 family=10 entries=1 op=nft_register_chain pid=2877 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 07:53:03.792000 audit[2877]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7fff63481690 a2=0 a3=7fff6348167c items=0 ppid=2822 pid=2877 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:03.792000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Feb 13 07:53:03.896000 audit[2878]: NETFILTER_CFG table=filter:44 family=2 entries=1 op=nft_register_chain pid=2878 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 07:53:03.896000 audit[2878]: SYSCALL arch=c000003e syscall=46 success=yes exit=108 a0=3 a1=7ffe57599e10 a2=0 a3=7ffe57599dfc items=0 ppid=2822 pid=2878 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:03.896000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Feb 13 07:53:03.903000 audit[2880]: NETFILTER_CFG table=filter:45 family=2 entries=1 op=nft_register_rule pid=2880 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 07:53:03.903000 audit[2880]: SYSCALL arch=c000003e syscall=46 success=yes exit=752 a0=3 a1=7ffd7177c600 a2=0 a3=7ffd7177c5ec items=0 ppid=2822 pid=2880 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:03.903000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C652073657276696365 Feb 13 07:53:03.913000 audit[2883]: NETFILTER_CFG table=filter:46 family=2 entries=1 op=nft_register_rule pid=2883 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 07:53:03.913000 audit[2883]: SYSCALL arch=c000003e syscall=46 success=yes exit=752 a0=3 a1=7ffc17b37d70 a2=0 a3=7ffc17b37d5c items=0 ppid=2822 pid=2883 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:03.913000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C65207365727669 Feb 13 07:53:03.915000 audit[2884]: NETFILTER_CFG table=filter:47 family=2 entries=1 op=nft_register_chain pid=2884 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 07:53:03.915000 audit[2884]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fffda145450 a2=0 a3=7fffda14543c items=0 ppid=2822 pid=2884 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:03.915000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Feb 13 07:53:03.921000 audit[2886]: NETFILTER_CFG table=filter:48 family=2 entries=1 op=nft_register_rule pid=2886 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 07:53:03.921000 audit[2886]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffeb60872c0 a2=0 a3=7ffeb60872ac items=0 ppid=2822 pid=2886 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:03.921000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Feb 13 07:53:03.924000 audit[2887]: NETFILTER_CFG table=filter:49 family=2 entries=1 op=nft_register_chain pid=2887 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 07:53:03.924000 audit[2887]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fffdc471060 a2=0 
a3=7fffdc47104c items=0 ppid=2822 pid=2887 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:03.924000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D740066696C746572 Feb 13 07:53:03.931000 audit[2889]: NETFILTER_CFG table=filter:50 family=2 entries=1 op=nft_register_rule pid=2889 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 07:53:03.931000 audit[2889]: SYSCALL arch=c000003e syscall=46 success=yes exit=744 a0=3 a1=7ffdebb16680 a2=0 a3=7ffdebb1666c items=0 ppid=2822 pid=2889 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:03.931000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D Feb 13 07:53:03.940000 audit[2892]: NETFILTER_CFG table=filter:51 family=2 entries=1 op=nft_register_rule pid=2892 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 07:53:03.940000 audit[2892]: SYSCALL arch=c000003e syscall=46 success=yes exit=744 a0=3 a1=7ffd7a10ff30 a2=0 a3=7ffd7a10ff1c items=0 ppid=2822 pid=2892 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:03.940000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D53 Feb 13 07:53:03.942000 audit[2893]: NETFILTER_CFG table=filter:52 family=2 entries=1 op=nft_register_chain pid=2893 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 07:53:03.942000 audit[2893]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffe898ce6e0 a2=0 a3=7ffe898ce6cc items=0 ppid=2822 pid=2893 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:03.942000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D464F5257415244002D740066696C746572 Feb 13 07:53:03.949000 audit[2895]: NETFILTER_CFG table=filter:53 family=2 entries=1 op=nft_register_rule pid=2895 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 07:53:03.949000 audit[2895]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffcb44d1220 a2=0 a3=7ffcb44d120c items=0 ppid=2822 pid=2895 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:03.949000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Feb 13 07:53:03.951000 audit[2896]: NETFILTER_CFG 
table=filter:54 family=2 entries=1 op=nft_register_chain pid=2896 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 07:53:03.951000 audit[2896]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7fff08caa9e0 a2=0 a3=7fff08caa9cc items=0 ppid=2822 pid=2896 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:03.951000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Feb 13 07:53:03.958000 audit[2898]: NETFILTER_CFG table=filter:55 family=2 entries=1 op=nft_register_rule pid=2898 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 07:53:03.958000 audit[2898]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffe0a337e80 a2=0 a3=7ffe0a337e6c items=0 ppid=2822 pid=2898 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:03.958000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Feb 13 07:53:03.967000 audit[2901]: NETFILTER_CFG table=filter:56 family=2 entries=1 op=nft_register_rule pid=2901 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 07:53:03.967000 audit[2901]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffda0927af0 a2=0 a3=7ffda0927adc items=0 ppid=2822 pid=2901 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:03.967000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Feb 13 07:53:03.976000 audit[2904]: NETFILTER_CFG table=filter:57 family=2 entries=1 op=nft_register_rule pid=2904 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 07:53:03.976000 audit[2904]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffc5650c690 a2=0 a3=7ffc5650c67c items=0 ppid=2822 pid=2904 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:03.976000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D Feb 13 07:53:03.979000 audit[2905]: NETFILTER_CFG table=nat:58 family=2 entries=1 op=nft_register_chain pid=2905 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 07:53:03.979000 audit[2905]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffeeec2b490 a2=0 a3=7ffeeec2b47c items=0 ppid=2822 pid=2905 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" 
exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:03.979000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D74006E6174 Feb 13 07:53:03.985000 audit[2907]: NETFILTER_CFG table=nat:59 family=2 entries=1 op=nft_register_rule pid=2907 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 07:53:03.985000 audit[2907]: SYSCALL arch=c000003e syscall=46 success=yes exit=524 a0=3 a1=7ffc8ac0c580 a2=0 a3=7ffc8ac0c56c items=0 ppid=2822 pid=2907 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:03.985000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Feb 13 07:53:03.994000 audit[2910]: NETFILTER_CFG table=nat:60 family=2 entries=1 op=nft_register_rule pid=2910 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 07:53:03.994000 audit[2910]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffcd8ced6e0 a2=0 a3=7ffcd8ced6cc items=0 ppid=2822 pid=2910 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:03.994000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Feb 13 07:53:04.009000 audit[2915]: NETFILTER_CFG table=nat:61 family=2 entries=1 op=nft_register_chain pid=2915 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 07:53:04.009000 audit[2915]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffd03542930 a2=0 a3=7ffd0354291c items=0 ppid=2822 pid=2915 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:04.009000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Feb 13 07:53:04.015000 audit[2917]: NETFILTER_CFG table=nat:62 family=2 entries=1 op=nft_register_rule pid=2917 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 07:53:04.015000 audit[2917]: SYSCALL arch=c000003e syscall=46 success=yes exit=532 a0=3 a1=7ffc57c1d820 a2=0 a3=7ffc57c1d80c items=0 ppid=2822 pid=2917 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:04.015000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Feb 13 07:53:04.048000 audit[2919]: NETFILTER_CFG table=filter:63 family=2 entries=8 op=nft_register_rule pid=2919 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 13 07:53:04.048000 audit[2919]: SYSCALL arch=c000003e syscall=46 success=yes exit=4956 a0=3 a1=7ffe7de3f5d0 a2=0 
a3=7ffe7de3f5bc items=0 ppid=2822 pid=2919 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:04.048000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 07:53:04.066000 audit[2919]: NETFILTER_CFG table=nat:64 family=2 entries=14 op=nft_register_chain pid=2919 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 13 07:53:04.066000 audit[2919]: SYSCALL arch=c000003e syscall=46 success=yes exit=5340 a0=3 a1=7ffe7de3f5d0 a2=0 a3=7ffe7de3f5bc items=0 ppid=2822 pid=2919 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:04.066000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 07:53:04.069000 audit[2925]: NETFILTER_CFG table=filter:65 family=10 entries=1 op=nft_register_chain pid=2925 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 07:53:04.069000 audit[2925]: SYSCALL arch=c000003e syscall=46 success=yes exit=108 a0=3 a1=7ffea5d3d210 a2=0 a3=7ffea5d3d1fc items=0 ppid=2822 pid=2925 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:04.069000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Feb 13 07:53:04.075000 audit[2927]: NETFILTER_CFG table=filter:66 family=10 entries=2 op=nft_register_chain pid=2927 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 07:53:04.075000 audit[2927]: SYSCALL arch=c000003e syscall=46 success=yes exit=836 a0=3 a1=7fff7e590020 a2=0 a3=7fff7e59000c items=0 ppid=2822 pid=2927 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:04.075000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C6520736572766963 Feb 13 07:53:04.085000 audit[2930]: NETFILTER_CFG table=filter:67 family=10 entries=2 op=nft_register_chain pid=2930 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 07:53:04.085000 audit[2930]: SYSCALL arch=c000003e syscall=46 success=yes exit=836 a0=3 a1=7ffc80f57510 a2=0 a3=7ffc80f574fc items=0 ppid=2822 pid=2930 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:04.085000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C652073657276 Feb 13 07:53:04.088000 audit[2931]: NETFILTER_CFG table=filter:68 family=10 entries=1 op=nft_register_chain 
pid=2931 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 07:53:04.088000 audit[2931]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffc839f5020 a2=0 a3=7ffc839f500c items=0 ppid=2822 pid=2931 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:04.088000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Feb 13 07:53:04.094000 audit[2933]: NETFILTER_CFG table=filter:69 family=10 entries=1 op=nft_register_rule pid=2933 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 07:53:04.094000 audit[2933]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7fff8c810df0 a2=0 a3=7fff8c810ddc items=0 ppid=2822 pid=2933 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:04.094000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Feb 13 07:53:04.096000 audit[2934]: NETFILTER_CFG table=filter:70 family=10 entries=1 op=nft_register_chain pid=2934 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 07:53:04.096000 audit[2934]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fffb9fd05d0 a2=0 a3=7fffb9fd05bc items=0 ppid=2822 pid=2934 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:04.096000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D740066696C746572 Feb 13 07:53:04.103000 audit[2936]: NETFILTER_CFG table=filter:71 family=10 entries=1 op=nft_register_rule pid=2936 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 07:53:04.103000 audit[2936]: SYSCALL arch=c000003e syscall=46 success=yes exit=744 a0=3 a1=7ffcb9bbd4c0 a2=0 a3=7ffcb9bbd4ac items=0 ppid=2822 pid=2936 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:04.103000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B554245 Feb 13 07:53:04.112000 audit[2939]: NETFILTER_CFG table=filter:72 family=10 entries=2 op=nft_register_chain pid=2939 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 07:53:04.112000 audit[2939]: SYSCALL arch=c000003e syscall=46 success=yes exit=828 a0=3 a1=7fff2fcf0cc0 a2=0 a3=7fff2fcf0cac items=0 ppid=2822 pid=2939 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:04.112000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D Feb 13 07:53:04.114000 audit[2940]: NETFILTER_CFG table=filter:73 family=10 entries=1 op=nft_register_chain pid=2940 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 07:53:04.114000 audit[2940]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffdbff4d1d0 a2=0 a3=7ffdbff4d1bc items=0 ppid=2822 pid=2940 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:04.114000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D464F5257415244002D740066696C746572 Feb 13 07:53:04.121000 audit[2942]: NETFILTER_CFG table=filter:74 family=10 entries=1 op=nft_register_rule pid=2942 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 07:53:04.121000 audit[2942]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7fffdc471c40 a2=0 a3=7fffdc471c2c items=0 ppid=2822 pid=2942 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:04.121000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Feb 13 07:53:04.124000 audit[2943]: NETFILTER_CFG table=filter:75 family=10 entries=1 op=nft_register_chain pid=2943 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 07:53:04.124000 audit[2943]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7fff6c28aca0 a2=0 a3=7fff6c28ac8c items=0 ppid=2822 pid=2943 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:04.124000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Feb 13 07:53:04.130000 audit[2945]: NETFILTER_CFG table=filter:76 family=10 entries=1 op=nft_register_rule pid=2945 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 07:53:04.130000 audit[2945]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffe0ccf5a00 a2=0 a3=7ffe0ccf59ec items=0 ppid=2822 pid=2945 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:04.130000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Feb 13 07:53:04.139000 audit[2948]: NETFILTER_CFG table=filter:77 family=10 entries=1 op=nft_register_rule pid=2948 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 07:53:04.139000 audit[2948]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffc6132be30 a2=0 a3=7ffc6132be1c 
items=0 ppid=2822 pid=2948 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:04.139000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D Feb 13 07:53:04.148000 audit[2951]: NETFILTER_CFG table=filter:78 family=10 entries=1 op=nft_register_rule pid=2951 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 07:53:04.148000 audit[2951]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffeda077bc0 a2=0 a3=7ffeda077bac items=0 ppid=2822 pid=2951 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:04.148000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C Feb 13 07:53:04.151000 audit[2952]: NETFILTER_CFG table=nat:79 family=10 entries=1 op=nft_register_chain pid=2952 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 07:53:04.151000 audit[2952]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffc4928e2f0 a2=0 a3=7ffc4928e2dc items=0 ppid=2822 pid=2952 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:04.151000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D74006E6174 Feb 13 07:53:04.156000 audit[2954]: NETFILTER_CFG table=nat:80 family=10 entries=2 op=nft_register_chain pid=2954 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 07:53:04.156000 audit[2954]: SYSCALL arch=c000003e syscall=46 success=yes exit=600 a0=3 a1=7fff68fcef30 a2=0 a3=7fff68fcef1c items=0 ppid=2822 pid=2954 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:04.156000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Feb 13 07:53:04.165000 audit[2957]: NETFILTER_CFG table=nat:81 family=10 entries=2 op=nft_register_chain pid=2957 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 07:53:04.165000 audit[2957]: SYSCALL arch=c000003e syscall=46 success=yes exit=608 a0=3 a1=7ffc39f2ae70 a2=0 a3=7ffc39f2ae5c items=0 ppid=2822 pid=2957 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:04.165000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D5700313030303030002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Feb 13 07:53:04.168000 audit[2958]: NETFILTER_CFG table=filter:82 family=10 entries=1 op=nft_register_chain pid=2958 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 07:53:04.168000 audit[2958]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffc66642440 a2=0 a3=7ffc6664242c items=0 ppid=2822 pid=2958 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:04.168000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4649524557414C4C002D740066696C746572 Feb 13 07:53:04.174000 audit[2960]: NETFILTER_CFG table=filter:83 family=10 entries=1 op=nft_register_rule pid=2960 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 07:53:04.174000 audit[2960]: SYSCALL arch=c000003e syscall=46 success=yes exit=228 a0=3 a1=7ffed9f67c00 a2=0 a3=7ffed9f67bec items=0 ppid=2822 pid=2960 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:04.174000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Feb 13 07:53:04.183000 audit[2963]: NETFILTER_CFG table=filter:84 family=10 entries=1 op=nft_register_rule pid=2963 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 07:53:04.183000 audit[2963]: SYSCALL arch=c000003e syscall=46 success=yes exit=228 a0=3 a1=7fff7de330e0 a2=0 a3=7fff7de330cc items=0 ppid=2822 pid=2963 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:04.183000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Feb 13 07:53:04.185000 audit[2964]: NETFILTER_CFG table=nat:85 family=10 entries=1 op=nft_register_chain pid=2964 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 07:53:04.185000 audit[2964]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffe73765f40 a2=0 a3=7ffe73765f2c items=0 ppid=2822 pid=2964 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:04.185000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Feb 13 07:53:04.191000 audit[2966]: NETFILTER_CFG table=nat:86 family=10 entries=2 op=nft_register_chain pid=2966 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 07:53:04.191000 audit[2966]: SYSCALL arch=c000003e syscall=46 success=yes exit=612 a0=3 a1=7fff359287d0 a2=0 a3=7fff359287bc items=0 ppid=2822 pid=2966 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:04.191000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D5700313030303030002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Feb 13 07:53:04.198000 audit[2968]: NETFILTER_CFG table=filter:87 family=10 entries=3 op=nft_register_rule pid=2968 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Feb 13 07:53:04.198000 audit[2968]: SYSCALL arch=c000003e syscall=46 success=yes exit=1916 a0=3 a1=7ffcfd6f9960 a2=0 a3=7ffcfd6f994c items=0 ppid=2822 pid=2968 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:04.198000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 07:53:04.199000 audit[2968]: NETFILTER_CFG table=nat:88 family=10 entries=7 op=nft_register_chain pid=2968 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Feb 13 07:53:04.199000 audit[2968]: SYSCALL arch=c000003e syscall=46 success=yes exit=1968 a0=3 a1=7ffcfd6f9960 a2=0 a3=7ffcfd6f994c items=0 ppid=2822 pid=2968 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:04.199000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 07:53:04.753586 kubelet[2569]: I0213 07:53:04.753566 2569 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="kube-system/kube-proxy-2cjzx" podStartSLOduration=2.753542432 podCreationTimestamp="2024-02-13 07:53:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-02-13 07:53:04.753206593 +0000 UTC m=+15.104258288" watchObservedRunningTime="2024-02-13 07:53:04.753542432 +0000 UTC m=+15.104594125" Feb 13 07:53:04.932246 update_engine[1448]: I0213 07:53:04.932179 1448 update_attempter.cc:509] Updating boot flags... 
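The NETFILTER_CFG burst above is kube-proxy registering its base chains (KUBE-PROXY-CANARY, KUBE-EXTERNAL-SERVICES, KUBE-NODEPORTS, KUBE-SERVICES, KUBE-FORWARD, KUBE-PROXY-FIREWALL, KUBE-POSTROUTING) and the jump rules into them across the mangle, filter, and nat tables, once per address family (family=2 is AF_INET via iptables, family=10 is AF_INET6 via ip6tables), finishing with bulk iptables-restore --noflush --counters loads. A rough Python sketch for tallying such a burst from the raw log; the field layout is taken from the records above, and the helper names are illustrative:

    import re

    # family=2 is AF_INET (iptables); family=10 is AF_INET6 (ip6tables).
    REC = re.compile(r"NETFILTER_CFG table=(\w+):\d+ family=(\d+) "
                     r"entries=(\d+) op=(\w+)")
    FAMILIES = {"2": "ipv4", "10": "ipv6"}

    def summarize(log_lines):
        """Tally registered nft entries by (table, family, operation)."""
        counts = {}
        for line in log_lines:
            # findall handles several records wrapped onto one log line
            for table, family, entries, op in REC.findall(line):
                key = (table, FAMILIES.get(family, family), op)
                counts[key] = counts.get(key, 0) + int(entries)
        return counts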
Feb 13 07:53:05.698857 env[1458]: time="2024-02-13T07:53:05.698831850Z" level=info msg="ImageCreate event &ImageCreate{Name:quay.io/tigera/operator:v1.32.3,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 07:53:05.699450 env[1458]: time="2024-02-13T07:53:05.699436149Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:7bc79e0d3be4fa8c35133127424f9b1ec775af43145b7dd58637905c76084827,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 07:53:05.700212 env[1458]: time="2024-02-13T07:53:05.700186280Z" level=info msg="ImageUpdate event &ImageUpdate{Name:quay.io/tigera/operator:v1.32.3,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 07:53:05.701308 env[1458]: time="2024-02-13T07:53:05.701294372Z" level=info msg="ImageCreate event &ImageCreate{Name:quay.io/tigera/operator@sha256:715ac9a30f8a9579e44258af20de354715429e11836b493918e9e1a696e9b028,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 07:53:05.701587 env[1458]: time="2024-02-13T07:53:05.701572435Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.32.3\" returns image reference \"sha256:7bc79e0d3be4fa8c35133127424f9b1ec775af43145b7dd58637905c76084827\"" Feb 13 07:53:05.702873 env[1458]: time="2024-02-13T07:53:05.702858888Z" level=info msg="CreateContainer within sandbox \"6f1c235cb066f321070d62e41eb3613be9267d40bcf99ef77ae7e6ef1e5202c6\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Feb 13 07:53:05.708092 env[1458]: time="2024-02-13T07:53:05.708074764Z" level=info msg="CreateContainer within sandbox \"6f1c235cb066f321070d62e41eb3613be9267d40bcf99ef77ae7e6ef1e5202c6\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"eafa66436b64a255228a3f6d5a6a895232b01639b727f1df0abde9c9cd1f2fa8\"" Feb 13 07:53:05.708433 env[1458]: time="2024-02-13T07:53:05.708419475Z" level=info msg="StartContainer for \"eafa66436b64a255228a3f6d5a6a895232b01639b727f1df0abde9c9cd1f2fa8\"" Feb 13 07:53:05.716743 systemd[1]: Started cri-containerd-eafa66436b64a255228a3f6d5a6a895232b01639b727f1df0abde9c9cd1f2fa8.scope. 
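Each container start here is bracketed by audit SYSCALL records for syscall=321, which is bpf(2) on x86_64; the first argument (a0, printed in hex) selects the BPF command, and the paired "BPF prog-id=N op=LOAD/UNLOAD" records (117-123 in this section) show runc swapping per-container filter programs, commonly the device-cgroup filter on cgroup v2 — an interpretation, not something the log states. The { perfmon } (capability=38, CAP_PERFMON) and { bpf } (capability=39, CAP_BPF) AVC denials are non-fatal: every such SYSCALL record reports success=yes, the usual reading being that the kernel's bpf_capable()/perfmon_capable() checks fall back to CAP_SYS_ADMIN. A small lookup sketch for the commands seen in this log:

    # Values from the kernel's enum bpf_cmd (uapi/linux/bpf.h); a0 is
    # printed in hex in the audit SYSCALL records above and below.
    BPF_CMDS = {
        0x0: "BPF_MAP_CREATE",          # e.g. a0=0 a2=3c: map creation
        0x5: "BPF_PROG_LOAD",           # pairs with "BPF prog-id=N op=LOAD"
        0xf: "BPF_OBJ_GET_INFO_BY_FD",  # querying an already-loaded object
    }

    def bpf_cmd(a0_field: str) -> str:
        cmd = int(a0_field, 16)
        return BPF_CMDS.get(cmd, "cmd 0x%x" % cmd)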
Feb 13 07:53:05.723000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:05.723000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:05.723000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:05.723000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:05.723000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:05.723000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:05.723000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:05.723000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:05.723000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:05.723000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:05.723000 audit: BPF prog-id=120 op=LOAD Feb 13 07:53:05.723000 audit[2992]: AVC avc: denied { bpf } for pid=2992 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:05.723000 audit[2992]: SYSCALL arch=c000003e syscall=321 success=yes exit=0 a0=f a1=c000145c48 a2=10 a3=1c items=0 ppid=2758 pid=2992 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:05.723000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6561666136363433366236346132353532323861336636643561366138 Feb 13 07:53:05.723000 audit[2992]: AVC avc: denied { perfmon } for pid=2992 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:05.723000 audit[2992]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=0 a1=c0001456b0 a2=3c a3=8 items=0 ppid=2758 pid=2992 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 
13 07:53:05.723000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6561666136363433366236346132353532323861336636643561366138 Feb 13 07:53:05.723000 audit[2992]: AVC avc: denied { bpf } for pid=2992 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:05.723000 audit[2992]: AVC avc: denied { bpf } for pid=2992 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:05.723000 audit[2992]: AVC avc: denied { bpf } for pid=2992 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:05.723000 audit[2992]: AVC avc: denied { perfmon } for pid=2992 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:05.723000 audit[2992]: AVC avc: denied { perfmon } for pid=2992 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:05.723000 audit[2992]: AVC avc: denied { perfmon } for pid=2992 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:05.723000 audit[2992]: AVC avc: denied { perfmon } for pid=2992 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:05.723000 audit[2992]: AVC avc: denied { perfmon } for pid=2992 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:05.723000 audit[2992]: AVC avc: denied { bpf } for pid=2992 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:05.723000 audit[2992]: AVC avc: denied { bpf } for pid=2992 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:05.723000 audit: BPF prog-id=121 op=LOAD Feb 13 07:53:05.723000 audit[2992]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=5 a1=c0001459d8 a2=78 a3=c00032fd40 items=0 ppid=2758 pid=2992 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:05.723000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6561666136363433366236346132353532323861336636643561366138 Feb 13 07:53:05.723000 audit[2992]: AVC avc: denied { bpf } for pid=2992 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:05.723000 audit[2992]: AVC avc: denied { bpf } for pid=2992 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 
07:53:05.723000 audit[2992]: AVC avc: denied { perfmon } for pid=2992 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:05.723000 audit[2992]: AVC avc: denied { perfmon } for pid=2992 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:05.723000 audit[2992]: AVC avc: denied { perfmon } for pid=2992 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:05.723000 audit[2992]: AVC avc: denied { perfmon } for pid=2992 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:05.723000 audit[2992]: AVC avc: denied { perfmon } for pid=2992 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:05.723000 audit[2992]: AVC avc: denied { bpf } for pid=2992 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:05.723000 audit[2992]: AVC avc: denied { bpf } for pid=2992 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:05.723000 audit: BPF prog-id=122 op=LOAD Feb 13 07:53:05.723000 audit[2992]: SYSCALL arch=c000003e syscall=321 success=yes exit=18 a0=5 a1=c000145770 a2=78 a3=c00032fd88 items=0 ppid=2758 pid=2992 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:05.723000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6561666136363433366236346132353532323861336636643561366138 Feb 13 07:53:05.723000 audit: BPF prog-id=122 op=UNLOAD Feb 13 07:53:05.723000 audit: BPF prog-id=121 op=UNLOAD Feb 13 07:53:05.723000 audit[2992]: AVC avc: denied { bpf } for pid=2992 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:05.723000 audit[2992]: AVC avc: denied { bpf } for pid=2992 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:05.723000 audit[2992]: AVC avc: denied { bpf } for pid=2992 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:05.723000 audit[2992]: AVC avc: denied { perfmon } for pid=2992 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:05.723000 audit[2992]: AVC avc: denied { perfmon } for pid=2992 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:05.723000 audit[2992]: AVC avc: denied { perfmon } for pid=2992 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 
permissive=0 Feb 13 07:53:05.723000 audit[2992]: AVC avc: denied { perfmon } for pid=2992 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:05.723000 audit[2992]: AVC avc: denied { perfmon } for pid=2992 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:05.723000 audit[2992]: AVC avc: denied { bpf } for pid=2992 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:05.723000 audit[2992]: AVC avc: denied { bpf } for pid=2992 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:05.723000 audit: BPF prog-id=123 op=LOAD Feb 13 07:53:05.723000 audit[2992]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=5 a1=c000145c30 a2=78 a3=c000388198 items=0 ppid=2758 pid=2992 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:05.723000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6561666136363433366236346132353532323861336636643561366138 Feb 13 07:53:05.730204 env[1458]: time="2024-02-13T07:53:05.730156574Z" level=info msg="StartContainer for \"eafa66436b64a255228a3f6d5a6a895232b01639b727f1df0abde9c9cd1f2fa8\" returns successfully" Feb 13 07:53:05.747087 kubelet[2569]: I0213 07:53:05.747062 2569 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="tigera-operator/tigera-operator-7ff8dc855-gmcwp" podStartSLOduration=1.793987206 podCreationTimestamp="2024-02-13 07:53:02 +0000 UTC" firstStartedPulling="2024-02-13 07:53:03.748767701 +0000 UTC m=+14.099819403" lastFinishedPulling="2024-02-13 07:53:05.701816216 +0000 UTC m=+16.052867908" observedRunningTime="2024-02-13 07:53:05.746933598 +0000 UTC m=+16.097985295" watchObservedRunningTime="2024-02-13 07:53:05.747035711 +0000 UTC m=+16.098087404" Feb 13 07:53:07.395000 audit[3037]: NETFILTER_CFG table=filter:89 family=2 entries=15 op=nft_register_rule pid=3037 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 13 07:53:07.395000 audit[3037]: SYSCALL arch=c000003e syscall=46 success=yes exit=5660 a0=3 a1=7ffd71f4ba40 a2=0 a3=7ffd71f4ba2c items=0 ppid=2822 pid=3037 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:07.395000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 07:53:07.396000 audit[3037]: NETFILTER_CFG table=nat:90 family=2 entries=12 op=nft_register_rule pid=3037 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 13 07:53:07.396000 audit[3037]: SYSCALL arch=c000003e syscall=46 success=yes exit=2572 a0=3 a1=7ffd71f4ba40 a2=0 a3=31030 items=0 ppid=2822 pid=3037 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" 
subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:07.396000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 07:53:07.430000 audit[3039]: NETFILTER_CFG table=filter:91 family=2 entries=16 op=nft_register_rule pid=3039 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 13 07:53:07.430000 audit[3039]: SYSCALL arch=c000003e syscall=46 success=yes exit=5660 a0=3 a1=7fffd4579b30 a2=0 a3=7fffd4579b1c items=0 ppid=2822 pid=3039 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:07.430000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 07:53:07.436000 audit[3039]: NETFILTER_CFG table=nat:92 family=2 entries=12 op=nft_register_rule pid=3039 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 13 07:53:07.436000 audit[3039]: SYSCALL arch=c000003e syscall=46 success=yes exit=2572 a0=3 a1=7fffd4579b30 a2=0 a3=31030 items=0 ppid=2822 pid=3039 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:07.436000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 07:53:07.532944 kubelet[2569]: I0213 07:53:07.532866 2569 topology_manager.go:212] "Topology Admit Handler" Feb 13 07:53:07.543213 systemd[1]: Created slice kubepods-besteffort-podae1b9d70_8574_4fcb_9d3f_391d183b25cd.slice. Feb 13 07:53:07.566566 kubelet[2569]: I0213 07:53:07.566548 2569 topology_manager.go:212] "Topology Admit Handler" Feb 13 07:53:07.570090 systemd[1]: Created slice kubepods-besteffort-podaa3ede51_df50_4639_8214_f4851f8dfc8e.slice. 
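The PROCTITLE fields in the audit records above are hex-encoded, NUL-separated argv strings. A minimal Go sketch of the decoding, for illustration only: decodeProctitle is a hypothetical helper (it is not part of auditd or of any tooling appearing in this log), and the payload constant below is copied verbatim from one of the iptables-restore PROCTITLE records above.

package main

import (
	"encoding/hex"
	"fmt"
	"strings"
)

// decodeProctitle turns an audit PROCTITLE hex payload back into the
// NUL-separated argv it encodes. Hypothetical helper, written for this sketch.
func decodeProctitle(h string) (string, error) {
	raw, err := hex.DecodeString(h)
	if err != nil {
		return "", err
	}
	// auditd joins argv with NUL bytes; render them as spaces for display.
	return strings.ReplaceAll(strings.TrimRight(string(raw), "\x00"), "\x00", " "), nil
}

func main() {
	// Payload copied from an iptables-restore PROCTITLE record in this log.
	title := "69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273"
	cmd, err := decodeProctitle(title)
	if err != nil {
		panic(err)
	}
	fmt.Println(cmd) // iptables-restore -w 5 -W 100000 --noflush --counters
}

The runc PROCTITLE payloads earlier in the log decode the same way, to runc --root /run/containerd/runc/k8s.io --log /run/containerd/io.containerd.runtime.v2.task/k8s.io/… with the trailing container ID truncated in the record itself.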
Feb 13 07:53:07.620347 kubelet[2569]: I0213 07:53:07.620274 2569 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/aa3ede51-df50-4639-8214-f4851f8dfc8e-lib-modules\") pod \"calico-node-9qktd\" (UID: \"aa3ede51-df50-4639-8214-f4851f8dfc8e\") " pod="calico-system/calico-node-9qktd" Feb 13 07:53:07.620677 kubelet[2569]: I0213 07:53:07.620460 2569 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/aa3ede51-df50-4639-8214-f4851f8dfc8e-policysync\") pod \"calico-node-9qktd\" (UID: \"aa3ede51-df50-4639-8214-f4851f8dfc8e\") " pod="calico-system/calico-node-9qktd" Feb 13 07:53:07.620677 kubelet[2569]: I0213 07:53:07.620552 2569 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/aa3ede51-df50-4639-8214-f4851f8dfc8e-cni-net-dir\") pod \"calico-node-9qktd\" (UID: \"aa3ede51-df50-4639-8214-f4851f8dfc8e\") " pod="calico-system/calico-node-9qktd" Feb 13 07:53:07.620677 kubelet[2569]: I0213 07:53:07.620616 2569 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/aa3ede51-df50-4639-8214-f4851f8dfc8e-cni-log-dir\") pod \"calico-node-9qktd\" (UID: \"aa3ede51-df50-4639-8214-f4851f8dfc8e\") " pod="calico-system/calico-node-9qktd" Feb 13 07:53:07.621222 kubelet[2569]: I0213 07:53:07.620834 2569 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/aa3ede51-df50-4639-8214-f4851f8dfc8e-flexvol-driver-host\") pod \"calico-node-9qktd\" (UID: \"aa3ede51-df50-4639-8214-f4851f8dfc8e\") " pod="calico-system/calico-node-9qktd" Feb 13 07:53:07.621222 kubelet[2569]: I0213 07:53:07.620981 2569 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/aa3ede51-df50-4639-8214-f4851f8dfc8e-var-run-calico\") pod \"calico-node-9qktd\" (UID: \"aa3ede51-df50-4639-8214-f4851f8dfc8e\") " pod="calico-system/calico-node-9qktd" Feb 13 07:53:07.621222 kubelet[2569]: I0213 07:53:07.621118 2569 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/ae1b9d70-8574-4fcb-9d3f-391d183b25cd-typha-certs\") pod \"calico-typha-9448844cf-2sfsw\" (UID: \"ae1b9d70-8574-4fcb-9d3f-391d183b25cd\") " pod="calico-system/calico-typha-9448844cf-2sfsw" Feb 13 07:53:07.621621 kubelet[2569]: I0213 07:53:07.621239 2569 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae1b9d70-8574-4fcb-9d3f-391d183b25cd-tigera-ca-bundle\") pod \"calico-typha-9448844cf-2sfsw\" (UID: \"ae1b9d70-8574-4fcb-9d3f-391d183b25cd\") " pod="calico-system/calico-typha-9448844cf-2sfsw" Feb 13 07:53:07.621621 kubelet[2569]: I0213 07:53:07.621410 2569 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/aa3ede51-df50-4639-8214-f4851f8dfc8e-var-lib-calico\") pod \"calico-node-9qktd\" (UID: \"aa3ede51-df50-4639-8214-f4851f8dfc8e\") " pod="calico-system/calico-node-9qktd" Feb 13 07:53:07.621621 kubelet[2569]: I0213 
07:53:07.621516 2569 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4xj4\" (UniqueName: \"kubernetes.io/projected/aa3ede51-df50-4639-8214-f4851f8dfc8e-kube-api-access-g4xj4\") pod \"calico-node-9qktd\" (UID: \"aa3ede51-df50-4639-8214-f4851f8dfc8e\") " pod="calico-system/calico-node-9qktd" Feb 13 07:53:07.621621 kubelet[2569]: I0213 07:53:07.621604 2569 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa3ede51-df50-4639-8214-f4851f8dfc8e-tigera-ca-bundle\") pod \"calico-node-9qktd\" (UID: \"aa3ede51-df50-4639-8214-f4851f8dfc8e\") " pod="calico-system/calico-node-9qktd" Feb 13 07:53:07.622232 kubelet[2569]: I0213 07:53:07.621830 2569 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvx95\" (UniqueName: \"kubernetes.io/projected/ae1b9d70-8574-4fcb-9d3f-391d183b25cd-kube-api-access-kvx95\") pod \"calico-typha-9448844cf-2sfsw\" (UID: \"ae1b9d70-8574-4fcb-9d3f-391d183b25cd\") " pod="calico-system/calico-typha-9448844cf-2sfsw" Feb 13 07:53:07.622232 kubelet[2569]: I0213 07:53:07.621960 2569 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/aa3ede51-df50-4639-8214-f4851f8dfc8e-cni-bin-dir\") pod \"calico-node-9qktd\" (UID: \"aa3ede51-df50-4639-8214-f4851f8dfc8e\") " pod="calico-system/calico-node-9qktd" Feb 13 07:53:07.622232 kubelet[2569]: I0213 07:53:07.622041 2569 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/aa3ede51-df50-4639-8214-f4851f8dfc8e-xtables-lock\") pod \"calico-node-9qktd\" (UID: \"aa3ede51-df50-4639-8214-f4851f8dfc8e\") " pod="calico-system/calico-node-9qktd" Feb 13 07:53:07.622232 kubelet[2569]: I0213 07:53:07.622098 2569 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/aa3ede51-df50-4639-8214-f4851f8dfc8e-node-certs\") pod \"calico-node-9qktd\" (UID: \"aa3ede51-df50-4639-8214-f4851f8dfc8e\") " pod="calico-system/calico-node-9qktd" Feb 13 07:53:07.694713 kubelet[2569]: I0213 07:53:07.694491 2569 topology_manager.go:212] "Topology Admit Handler" Feb 13 07:53:07.695337 kubelet[2569]: E0213 07:53:07.695280 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 07:53:07.723015 kubelet[2569]: I0213 07:53:07.722978 2569 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/d3077153-a5bc-4449-ba4f-3a1b2983528b-varrun\") pod \"csi-node-driver-8djc9\" (UID: \"d3077153-a5bc-4449-ba4f-3a1b2983528b\") " pod="calico-system/csi-node-driver-8djc9" Feb 13 07:53:07.723211 kubelet[2569]: I0213 07:53:07.723042 2569 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d3077153-a5bc-4449-ba4f-3a1b2983528b-kubelet-dir\") pod \"csi-node-driver-8djc9\" (UID: \"d3077153-a5bc-4449-ba4f-3a1b2983528b\") " 
pod="calico-system/csi-node-driver-8djc9" Feb 13 07:53:07.723304 kubelet[2569]: I0213 07:53:07.723274 2569 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d3077153-a5bc-4449-ba4f-3a1b2983528b-socket-dir\") pod \"csi-node-driver-8djc9\" (UID: \"d3077153-a5bc-4449-ba4f-3a1b2983528b\") " pod="calico-system/csi-node-driver-8djc9" Feb 13 07:53:07.723384 kubelet[2569]: I0213 07:53:07.723346 2569 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d3077153-a5bc-4449-ba4f-3a1b2983528b-registration-dir\") pod \"csi-node-driver-8djc9\" (UID: \"d3077153-a5bc-4449-ba4f-3a1b2983528b\") " pod="calico-system/csi-node-driver-8djc9" Feb 13 07:53:07.723653 kubelet[2569]: E0213 07:53:07.723621 2569 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 07:53:07.723653 kubelet[2569]: W0213 07:53:07.723649 2569 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 07:53:07.723786 kubelet[2569]: E0213 07:53:07.723683 2569 plugins.go:729] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 07:53:07.723923 kubelet[2569]: E0213 07:53:07.723907 2569 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 07:53:07.723982 kubelet[2569]: W0213 07:53:07.723923 2569 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 07:53:07.723982 kubelet[2569]: E0213 07:53:07.723948 2569 plugins.go:729] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 07:53:07.724151 kubelet[2569]: E0213 07:53:07.724140 2569 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 07:53:07.724151 kubelet[2569]: W0213 07:53:07.724150 2569 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 07:53:07.724246 kubelet[2569]: E0213 07:53:07.724164 2569 plugins.go:729] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 07:53:07.724339 kubelet[2569]: E0213 07:53:07.724328 2569 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 07:53:07.724339 kubelet[2569]: W0213 07:53:07.724338 2569 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 07:53:07.724429 kubelet[2569]: E0213 07:53:07.724350 2569 plugins.go:729] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 07:53:07.724523 kubelet[2569]: E0213 07:53:07.724512 2569 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 07:53:07.724523 kubelet[2569]: W0213 07:53:07.724522 2569 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 07:53:07.724672 kubelet[2569]: E0213 07:53:07.724537 2569 plugins.go:729] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 07:53:07.724751 kubelet[2569]: E0213 07:53:07.724704 2569 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 07:53:07.724751 kubelet[2569]: W0213 07:53:07.724714 2569 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 07:53:07.724751 kubelet[2569]: E0213 07:53:07.724729 2569 plugins.go:729] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 07:53:07.724909 kubelet[2569]: E0213 07:53:07.724897 2569 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 07:53:07.724909 kubelet[2569]: W0213 07:53:07.724908 2569 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 07:53:07.725009 kubelet[2569]: E0213 07:53:07.724922 2569 plugins.go:729] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 07:53:07.725245 kubelet[2569]: E0213 07:53:07.725228 2569 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 07:53:07.725245 kubelet[2569]: W0213 07:53:07.725245 2569 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 07:53:07.725359 kubelet[2569]: E0213 07:53:07.725273 2569 plugins.go:729] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 07:53:07.725492 kubelet[2569]: E0213 07:53:07.725480 2569 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 07:53:07.725543 kubelet[2569]: W0213 07:53:07.725491 2569 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 07:53:07.725543 kubelet[2569]: E0213 07:53:07.725510 2569 plugins.go:729] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 07:53:07.725979 kubelet[2569]: E0213 07:53:07.725963 2569 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 07:53:07.725979 kubelet[2569]: W0213 07:53:07.725978 2569 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 07:53:07.726106 kubelet[2569]: E0213 07:53:07.726011 2569 plugins.go:729] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 07:53:07.726207 kubelet[2569]: E0213 07:53:07.726194 2569 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 07:53:07.726274 kubelet[2569]: W0213 07:53:07.726206 2569 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 07:53:07.726274 kubelet[2569]: E0213 07:53:07.726244 2569 plugins.go:729] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 07:53:07.726378 kubelet[2569]: E0213 07:53:07.726369 2569 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 07:53:07.726378 kubelet[2569]: W0213 07:53:07.726377 2569 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 07:53:07.726453 kubelet[2569]: E0213 07:53:07.726392 2569 plugins.go:729] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 07:53:07.726543 kubelet[2569]: E0213 07:53:07.726532 2569 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 07:53:07.726543 kubelet[2569]: W0213 07:53:07.726542 2569 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 07:53:07.726667 kubelet[2569]: E0213 07:53:07.726553 2569 plugins.go:729] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 07:53:07.726751 kubelet[2569]: E0213 07:53:07.726741 2569 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 07:53:07.726799 kubelet[2569]: W0213 07:53:07.726751 2569 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 07:53:07.726799 kubelet[2569]: E0213 07:53:07.726766 2569 plugins.go:729] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 07:53:07.726914 kubelet[2569]: E0213 07:53:07.726903 2569 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 07:53:07.726914 kubelet[2569]: W0213 07:53:07.726912 2569 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 07:53:07.727023 kubelet[2569]: E0213 07:53:07.726926 2569 plugins.go:729] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 07:53:07.727080 kubelet[2569]: E0213 07:53:07.727070 2569 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 07:53:07.727080 kubelet[2569]: W0213 07:53:07.727078 2569 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 07:53:07.727161 kubelet[2569]: E0213 07:53:07.727091 2569 plugins.go:729] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 07:53:07.727224 kubelet[2569]: E0213 07:53:07.727215 2569 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 07:53:07.727224 kubelet[2569]: W0213 07:53:07.727225 2569 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 07:53:07.727321 kubelet[2569]: E0213 07:53:07.727235 2569 plugins.go:729] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 07:53:07.727395 kubelet[2569]: E0213 07:53:07.727382 2569 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 07:53:07.727444 kubelet[2569]: W0213 07:53:07.727397 2569 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 07:53:07.727444 kubelet[2569]: E0213 07:53:07.727415 2569 plugins.go:729] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 07:53:07.727544 kubelet[2569]: E0213 07:53:07.727535 2569 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 07:53:07.727587 kubelet[2569]: W0213 07:53:07.727544 2569 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 07:53:07.727587 kubelet[2569]: E0213 07:53:07.727559 2569 plugins.go:729] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 07:53:07.727725 kubelet[2569]: E0213 07:53:07.727716 2569 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 07:53:07.727725 kubelet[2569]: W0213 07:53:07.727724 2569 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 07:53:07.727822 kubelet[2569]: E0213 07:53:07.727740 2569 plugins.go:729] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 07:53:07.727996 kubelet[2569]: E0213 07:53:07.727985 2569 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 07:53:07.727996 kubelet[2569]: W0213 07:53:07.727994 2569 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 07:53:07.728096 kubelet[2569]: E0213 07:53:07.728007 2569 plugins.go:729] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 07:53:07.728153 kubelet[2569]: E0213 07:53:07.728144 2569 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 07:53:07.728196 kubelet[2569]: W0213 07:53:07.728155 2569 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 07:53:07.728196 kubelet[2569]: E0213 07:53:07.728177 2569 plugins.go:729] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 07:53:07.728362 kubelet[2569]: E0213 07:53:07.728350 2569 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 07:53:07.728414 kubelet[2569]: W0213 07:53:07.728362 2569 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 07:53:07.728414 kubelet[2569]: E0213 07:53:07.728397 2569 plugins.go:729] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 07:53:07.728527 kubelet[2569]: E0213 07:53:07.728516 2569 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 07:53:07.728527 kubelet[2569]: W0213 07:53:07.728524 2569 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 07:53:07.728628 kubelet[2569]: E0213 07:53:07.728552 2569 plugins.go:729] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 07:53:07.728700 kubelet[2569]: E0213 07:53:07.728690 2569 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 07:53:07.728700 kubelet[2569]: W0213 07:53:07.728698 2569 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 07:53:07.728800 kubelet[2569]: E0213 07:53:07.728722 2569 plugins.go:729] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 07:53:07.728879 kubelet[2569]: E0213 07:53:07.728868 2569 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 07:53:07.728879 kubelet[2569]: W0213 07:53:07.728877 2569 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 07:53:07.729031 kubelet[2569]: E0213 07:53:07.728914 2569 plugins.go:729] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 07:53:07.729095 kubelet[2569]: E0213 07:53:07.729079 2569 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 07:53:07.729154 kubelet[2569]: W0213 07:53:07.729094 2569 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 07:53:07.729154 kubelet[2569]: E0213 07:53:07.729132 2569 plugins.go:729] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 07:53:07.729315 kubelet[2569]: E0213 07:53:07.729302 2569 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 07:53:07.729315 kubelet[2569]: W0213 07:53:07.729313 2569 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 07:53:07.729439 kubelet[2569]: E0213 07:53:07.729330 2569 plugins.go:729] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 07:53:07.729515 kubelet[2569]: E0213 07:53:07.729503 2569 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 07:53:07.729515 kubelet[2569]: W0213 07:53:07.729512 2569 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 07:53:07.729652 kubelet[2569]: E0213 07:53:07.729526 2569 plugins.go:729] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 07:53:07.729806 kubelet[2569]: E0213 07:53:07.729789 2569 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 07:53:07.729806 kubelet[2569]: W0213 07:53:07.729805 2569 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 07:53:07.729943 kubelet[2569]: E0213 07:53:07.729832 2569 plugins.go:729] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 07:53:07.730038 kubelet[2569]: E0213 07:53:07.730023 2569 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 07:53:07.730101 kubelet[2569]: W0213 07:53:07.730039 2569 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 07:53:07.730101 kubelet[2569]: E0213 07:53:07.730064 2569 plugins.go:729] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 07:53:07.730279 kubelet[2569]: E0213 07:53:07.730266 2569 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 07:53:07.730340 kubelet[2569]: W0213 07:53:07.730282 2569 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 07:53:07.730340 kubelet[2569]: E0213 07:53:07.730320 2569 plugins.go:729] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 07:53:07.730450 kubelet[2569]: E0213 07:53:07.730433 2569 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 07:53:07.730450 kubelet[2569]: W0213 07:53:07.730442 2569 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 07:53:07.730570 kubelet[2569]: E0213 07:53:07.730470 2569 plugins.go:729] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 07:53:07.730570 kubelet[2569]: I0213 07:53:07.730509 2569 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jghgt\" (UniqueName: \"kubernetes.io/projected/d3077153-a5bc-4449-ba4f-3a1b2983528b-kube-api-access-jghgt\") pod \"csi-node-driver-8djc9\" (UID: \"d3077153-a5bc-4449-ba4f-3a1b2983528b\") " pod="calico-system/csi-node-driver-8djc9" Feb 13 07:53:07.730709 kubelet[2569]: E0213 07:53:07.730580 2569 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 07:53:07.730709 kubelet[2569]: W0213 07:53:07.730589 2569 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 07:53:07.730709 kubelet[2569]: E0213 07:53:07.730620 2569 plugins.go:729] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 07:53:07.730872 kubelet[2569]: E0213 07:53:07.730761 2569 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 07:53:07.730872 kubelet[2569]: W0213 07:53:07.730770 2569 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 07:53:07.730872 kubelet[2569]: E0213 07:53:07.730799 2569 plugins.go:729] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 07:53:07.731050 kubelet[2569]: E0213 07:53:07.730906 2569 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 07:53:07.731050 kubelet[2569]: W0213 07:53:07.730918 2569 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 07:53:07.731050 kubelet[2569]: E0213 07:53:07.730936 2569 plugins.go:729] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 07:53:07.731234 kubelet[2569]: E0213 07:53:07.731070 2569 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 07:53:07.731234 kubelet[2569]: W0213 07:53:07.731079 2569 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 07:53:07.731234 kubelet[2569]: E0213 07:53:07.731092 2569 plugins.go:729] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 07:53:07.731382 kubelet[2569]: E0213 07:53:07.731235 2569 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 07:53:07.731382 kubelet[2569]: W0213 07:53:07.731246 2569 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 07:53:07.731382 kubelet[2569]: E0213 07:53:07.731260 2569 plugins.go:729] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 07:53:07.731382 kubelet[2569]: E0213 07:53:07.731369 2569 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 07:53:07.731382 kubelet[2569]: W0213 07:53:07.731376 2569 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 07:53:07.731619 kubelet[2569]: E0213 07:53:07.731390 2569 plugins.go:729] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 07:53:07.731619 kubelet[2569]: E0213 07:53:07.731534 2569 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 07:53:07.731619 kubelet[2569]: W0213 07:53:07.731544 2569 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 07:53:07.731619 kubelet[2569]: E0213 07:53:07.731563 2569 plugins.go:729] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 07:53:07.731831 kubelet[2569]: E0213 07:53:07.731761 2569 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 07:53:07.731831 kubelet[2569]: W0213 07:53:07.731770 2569 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 07:53:07.731831 kubelet[2569]: E0213 07:53:07.731787 2569 plugins.go:729] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 07:53:07.731979 kubelet[2569]: E0213 07:53:07.731968 2569 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 07:53:07.731979 kubelet[2569]: W0213 07:53:07.731977 2569 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 07:53:07.732063 kubelet[2569]: E0213 07:53:07.731991 2569 plugins.go:729] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 07:53:07.732133 kubelet[2569]: E0213 07:53:07.732126 2569 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 07:53:07.732133 kubelet[2569]: W0213 07:53:07.732133 2569 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 07:53:07.732206 kubelet[2569]: E0213 07:53:07.732144 2569 plugins.go:729] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 07:53:07.732278 kubelet[2569]: E0213 07:53:07.732271 2569 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 07:53:07.732278 kubelet[2569]: W0213 07:53:07.732278 2569 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 07:53:07.732347 kubelet[2569]: E0213 07:53:07.732287 2569 plugins.go:729] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 07:53:07.732419 kubelet[2569]: E0213 07:53:07.732410 2569 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 07:53:07.732419 kubelet[2569]: W0213 07:53:07.732417 2569 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 07:53:07.732508 kubelet[2569]: E0213 07:53:07.732428 2569 plugins.go:729] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 07:53:07.732554 kubelet[2569]: E0213 07:53:07.732540 2569 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 07:53:07.732554 kubelet[2569]: W0213 07:53:07.732546 2569 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 07:53:07.732629 kubelet[2569]: E0213 07:53:07.732556 2569 plugins.go:729] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 07:53:07.732728 kubelet[2569]: E0213 07:53:07.732718 2569 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 07:53:07.732728 kubelet[2569]: W0213 07:53:07.732727 2569 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 07:53:07.732808 kubelet[2569]: E0213 07:53:07.732738 2569 plugins.go:729] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 07:53:07.732935 kubelet[2569]: E0213 07:53:07.732921 2569 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 07:53:07.732988 kubelet[2569]: W0213 07:53:07.732936 2569 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 07:53:07.732988 kubelet[2569]: E0213 07:53:07.732961 2569 plugins.go:729] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 07:53:07.733145 kubelet[2569]: E0213 07:53:07.733133 2569 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 07:53:07.733145 kubelet[2569]: W0213 07:53:07.733143 2569 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 07:53:07.733244 kubelet[2569]: E0213 07:53:07.733161 2569 plugins.go:729] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 07:53:07.733310 kubelet[2569]: E0213 07:53:07.733297 2569 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 07:53:07.733364 kubelet[2569]: W0213 07:53:07.733309 2569 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 07:53:07.733364 kubelet[2569]: E0213 07:53:07.733328 2569 plugins.go:729] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 07:53:07.733520 kubelet[2569]: E0213 07:53:07.733508 2569 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 07:53:07.733520 kubelet[2569]: W0213 07:53:07.733518 2569 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 07:53:07.733646 kubelet[2569]: E0213 07:53:07.733537 2569 plugins.go:729] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 07:53:07.733700 kubelet[2569]: E0213 07:53:07.733672 2569 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 07:53:07.733700 kubelet[2569]: W0213 07:53:07.733680 2569 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 07:53:07.733700 kubelet[2569]: E0213 07:53:07.733692 2569 plugins.go:729] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 07:53:07.733863 kubelet[2569]: E0213 07:53:07.733831 2569 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 07:53:07.733863 kubelet[2569]: W0213 07:53:07.733842 2569 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 07:53:07.733969 kubelet[2569]: E0213 07:53:07.733874 2569 plugins.go:729] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 07:53:07.734020 kubelet[2569]: E0213 07:53:07.733989 2569 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 07:53:07.734020 kubelet[2569]: W0213 07:53:07.733996 2569 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 07:53:07.734020 kubelet[2569]: E0213 07:53:07.734010 2569 plugins.go:729] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 07:53:07.734173 kubelet[2569]: E0213 07:53:07.734135 2569 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 07:53:07.734173 kubelet[2569]: W0213 07:53:07.734142 2569 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 07:53:07.734173 kubelet[2569]: E0213 07:53:07.734154 2569 plugins.go:729] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 07:53:07.734308 kubelet[2569]: E0213 07:53:07.734287 2569 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 07:53:07.734308 kubelet[2569]: W0213 07:53:07.734294 2569 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 07:53:07.734308 kubelet[2569]: E0213 07:53:07.734306 2569 plugins.go:729] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 07:53:07.734458 kubelet[2569]: E0213 07:53:07.734413 2569 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 07:53:07.734458 kubelet[2569]: W0213 07:53:07.734420 2569 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 07:53:07.734458 kubelet[2569]: E0213 07:53:07.734432 2569 plugins.go:729] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 07:53:07.734603 kubelet[2569]: E0213 07:53:07.734568 2569 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 07:53:07.734603 kubelet[2569]: W0213 07:53:07.734575 2569 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 07:53:07.734603 kubelet[2569]: E0213 07:53:07.734586 2569 plugins.go:729] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" [the same E/W/E driver-call triplet repeats roughly 25 more times between 07:53:07.734 and 07:53:07.838; verbatim repeats elided] Feb 13 07:53:07.838469 kubelet[2569]: E0213 07:53:07.838458 2569 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 07:53:07.838469 kubelet[2569]: W0213 07:53:07.838468 2569 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 07:53:07.838573 kubelet[2569]: E0213 07:53:07.838483 2569 plugins.go:729] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 07:53:07.846100 env[1458]: time="2024-02-13T07:53:07.846058829Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-9448844cf-2sfsw,Uid:ae1b9d70-8574-4fcb-9d3f-391d183b25cd,Namespace:calico-system,Attempt:0,}" Feb 13 07:53:07.848607 kubelet[2569]: E0213 07:53:07.848588 2569 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 07:53:07.848607 kubelet[2569]: W0213 07:53:07.848603 2569 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 07:53:07.848799 kubelet[2569]: E0213 07:53:07.848624 2569 plugins.go:729] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 07:53:07.872182 env[1458]: time="2024-02-13T07:53:07.872141663Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-9qktd,Uid:aa3ede51-df50-4639-8214-f4851f8dfc8e,Namespace:calico-system,Attempt:0,}" Feb 13 07:53:08.252797 env[1458]: time="2024-02-13T07:53:08.252628581Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 07:53:08.252797 env[1458]: time="2024-02-13T07:53:08.252726083Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 07:53:08.252797 env[1458]: time="2024-02-13T07:53:08.252746900Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 07:53:08.253122 env[1458]: time="2024-02-13T07:53:08.252929302Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/32e1480b406c92107c49416a481ad310090768900478462218b77118e0b37d34 pid=3147 runtime=io.containerd.runc.v2 Feb 13 07:53:08.253185 env[1458]: time="2024-02-13T07:53:08.253093279Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 07:53:08.253185 env[1458]: time="2024-02-13T07:53:08.253148365Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 07:53:08.253327 env[1458]: time="2024-02-13T07:53:08.253168265Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 07:53:08.253389 env[1458]: time="2024-02-13T07:53:08.253327388Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/ea8de6a2ed858380714ec179f757b318f4d83d80e1b98e8e7b10a03fae987ad8 pid=3148 runtime=io.containerd.runc.v2 Feb 13 07:53:08.269572 systemd[1]: Started cri-containerd-32e1480b406c92107c49416a481ad310090768900478462218b77118e0b37d34.scope. Feb 13 07:53:08.271871 systemd[1]: Started cri-containerd-ea8de6a2ed858380714ec179f757b318f4d83d80e1b98e8e7b10a03fae987ad8.scope. 
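The driver-call errors above come from kubelet's FlexVolume prober: each directory under /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ is probed by exec'ing `<driver> init` and unmarshalling stdout as JSON. Here the nodeagent~uds/uds binary does not exist, stdout is empty, and unmarshalling an empty string fails with exactly this "unexpected end of JSON input"; the spam typically stops once the driver binary is installed or the stale nodeagent~uds directory is removed. A minimal Go sketch (not kubelet's actual code; the reply format below follows the FlexVolume convention of a JSON status object):

```go
// Why an empty stdout from `<driver> init` yields "unexpected end of JSON
// input": json.Unmarshal fails on empty input before any field is read.
package main

import (
	"encoding/json"
	"fmt"
)

// driverStatus mirrors the shape of the JSON a FlexVolume driver is
// expected to print, e.g. {"status":"Success","capabilities":{"attach":false}}.
type driverStatus struct {
	Status       string          `json:"status"`
	Capabilities map[string]bool `json:"capabilities,omitempty"`
}

func main() {
	var st driverStatus

	// Empty output, as when the driver binary is missing from $PATH:
	err := json.Unmarshal([]byte(""), &st)
	fmt.Println(err) // unexpected end of JSON input

	// What a well-behaved driver would have printed on `init`:
	ok := []byte(`{"status":"Success","capabilities":{"attach":false}}`)
	if err := json.Unmarshal(ok, &st); err == nil {
		fmt.Printf("parsed: %+v\n", st)
	}
}
```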
Feb 13 07:53:08.282000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.309893 kernel: kauditd_printk_skb: 373 callbacks suppressed Feb 13 07:53:08.309958 kernel: audit: type=1400 audit(1707810788.282:1065): avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.282000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.434985 kernel: audit: type=1400 audit(1707810788.282:1066): avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.435023 kernel: audit: type=1400 audit(1707810788.282:1067): avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.282000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.497552 kernel: audit: type=1400 audit(1707810788.282:1068): avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.282000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.623416 kernel: audit: type=1400 audit(1707810788.282:1069): avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.623472 kernel: audit: audit_backlog=65 > audit_backlog_limit=64 Feb 13 07:53:08.623497 kernel: audit: audit_lost=3 audit_rate_limit=0 audit_backlog_limit=64 Feb 13 07:53:08.623509 kernel: audit: backlog limit exceeded Feb 13 07:53:08.282000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.650690 kernel: audit: audit_backlog=65 > audit_backlog_limit=64 Feb 13 07:53:08.650725 kernel: audit: audit_lost=4 audit_rate_limit=0 audit_backlog_limit=64 Feb 13 07:53:08.282000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.282000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.282000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.282000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 
tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.371000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.371000 audit: BPF prog-id=124 op=LOAD Feb 13 07:53:08.372000 audit[3166]: AVC avc: denied { bpf } for pid=3166 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.372000 audit[3166]: SYSCALL arch=c000003e syscall=321 success=yes exit=0 a0=f a1=c000197c48 a2=10 a3=1c items=0 ppid=3147 pid=3166 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:08.372000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3332653134383062343036633932313037633439343136613438316164 Feb 13 07:53:08.372000 audit[3166]: AVC avc: denied { perfmon } for pid=3166 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.372000 audit[3166]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=0 a1=c0001976b0 a2=3c a3=c items=0 ppid=3147 pid=3166 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:08.372000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3332653134383062343036633932313037633439343136613438316164 Feb 13 07:53:08.372000 audit[3166]: AVC avc: denied { bpf } for pid=3166 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.372000 audit[3166]: AVC avc: denied { bpf } for pid=3166 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.372000 audit[3166]: AVC avc: denied { bpf } for pid=3166 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.372000 audit[3166]: AVC avc: denied { perfmon } for pid=3166 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.372000 audit[3166]: AVC avc: denied { perfmon } for pid=3166 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.372000 audit[3166]: AVC avc: denied { perfmon } for pid=3166 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.372000 audit[3166]: AVC avc: denied { perfmon } for pid=3166 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.372000 
audit[3166]: AVC avc: denied { perfmon } for pid=3166 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.372000 audit[3166]: AVC avc: denied { bpf } for pid=3166 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.372000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.372000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.372000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.372000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.372000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.372000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.372000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.372000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.372000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.372000 audit[3166]: AVC avc: denied { bpf } for pid=3166 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.372000 audit: BPF prog-id=125 op=LOAD Feb 13 07:53:08.372000 audit[3166]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=5 a1=c0001979d8 a2=78 a3=c00032ead0 items=0 ppid=3147 pid=3166 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:08.372000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3332653134383062343036633932313037633439343136613438316164 Feb 13 07:53:08.496000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.496000 audit: BPF prog-id=126 op=LOAD Feb 13 07:53:08.496000 audit[3166]: AVC avc: denied { bpf } for pid=3166 comm="runc" capability=39 
scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.496000 audit[3166]: AVC avc: denied { bpf } for pid=3166 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.496000 audit[3166]: AVC avc: denied { perfmon } for pid=3166 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.496000 audit[3166]: AVC avc: denied { perfmon } for pid=3166 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.496000 audit[3166]: AVC avc: denied { perfmon } for pid=3166 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.496000 audit[3166]: AVC avc: denied { perfmon } for pid=3166 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.496000 audit[3166]: AVC avc: denied { perfmon } for pid=3166 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.496000 audit[3166]: AVC avc: denied { bpf } for pid=3166 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.497000 audit[3167]: AVC avc: denied { bpf } for pid=3167 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.497000 audit[3167]: SYSCALL arch=c000003e syscall=321 success=yes exit=0 a0=f a1=c000145c48 a2=10 a3=1c items=0 ppid=3148 pid=3167 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:08.497000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6561386465366132656438353833383037313465633137396637353762 Feb 13 07:53:08.497000 audit[3167]: AVC avc: denied { perfmon } for pid=3167 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.497000 audit[3167]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=0 a1=c0001456b0 a2=3c a3=c items=0 ppid=3148 pid=3167 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:08.497000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6561386465366132656438353833383037313465633137396637353762 Feb 13 07:53:08.497000 audit[3167]: AVC avc: denied { bpf } for pid=3167 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.497000 
audit[3167]: AVC avc: denied { bpf } for pid=3167 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.497000 audit[3167]: AVC avc: denied { bpf } for pid=3167 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.497000 audit[3167]: AVC avc: denied { perfmon } for pid=3167 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.497000 audit[3167]: AVC avc: denied { perfmon } for pid=3167 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.497000 audit[3167]: AVC avc: denied { perfmon } for pid=3167 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.497000 audit[3167]: AVC avc: denied { perfmon } for pid=3167 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.497000 audit[3167]: AVC avc: denied { perfmon } for pid=3167 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.497000 audit[3167]: AVC avc: denied { bpf } for pid=3167 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.496000 audit[3166]: AVC avc: denied { bpf } for pid=3166 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.499000 audit[3201]: NETFILTER_CFG table=filter:93 family=2 entries=16 op=nft_register_rule pid=3201 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 13 07:53:08.499000 audit[3201]: SYSCALL arch=c000003e syscall=46 success=yes exit=5660 a0=3 a1=7ffd1decec60 a2=0 a3=7ffd1decec4c items=0 ppid=2822 pid=3201 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:08.499000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 07:53:08.497000 audit[3167]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=5 a1=c0001459d8 a2=78 a3=c0001dfbb0 items=0 ppid=3148 pid=3167 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:08.496000 audit[3166]: SYSCALL arch=c000003e syscall=321 success=yes exit=18 a0=5 a1=c000197770 a2=78 a3=c00032eb18 items=0 ppid=3147 pid=3166 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:08.497000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6561386465366132656438353833383037313465633137396637353762 Feb 13 07:53:08.496000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3332653134383062343036633932313037633439343136613438316164 Feb 13 07:53:08.753000 audit: BPF prog-id=127 op=UNLOAD Feb 13 07:53:08.753000 audit[3167]: AVC avc: denied { bpf } for pid=3167 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.753000 audit: BPF prog-id=125 op=UNLOAD Feb 13 07:53:08.753000 audit[3167]: AVC avc: denied { bpf } for pid=3167 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.753000 audit[3167]: AVC avc: denied { perfmon } for pid=3167 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.753000 audit[3167]: AVC avc: denied { perfmon } for pid=3167 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.753000 audit[3167]: AVC avc: denied { perfmon } for pid=3167 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.753000 audit[3167]: AVC avc: denied { perfmon } for pid=3167 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.753000 audit[3167]: AVC avc: denied { perfmon } for pid=3167 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.753000 audit[3167]: AVC avc: denied { bpf } for pid=3167 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.753000 audit[3166]: AVC avc: denied { bpf } for pid=3166 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.753000 audit[3166]: AVC avc: denied { bpf } for pid=3166 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.753000 audit[3166]: AVC avc: denied { bpf } for pid=3166 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.753000 audit[3166]: AVC avc: denied { perfmon } for pid=3166 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.753000 audit[3166]: AVC avc: denied { perfmon } for pid=3166 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.753000 audit[3166]: AVC avc: denied { perfmon } for pid=3166 comm="runc" capability=38 
scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.753000 audit[3166]: AVC avc: denied { perfmon } for pid=3166 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.753000 audit[3166]: AVC avc: denied { perfmon } for pid=3166 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.753000 audit[3166]: AVC avc: denied { bpf } for pid=3166 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.753000 audit[3167]: AVC avc: denied { bpf } for pid=3167 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.753000 audit: BPF prog-id=129 op=LOAD Feb 13 07:53:08.753000 audit[3167]: SYSCALL arch=c000003e syscall=321 success=yes exit=18 a0=5 a1=c000145770 a2=78 a3=c0001dfbf8 items=0 ppid=3148 pid=3167 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:08.753000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6561386465366132656438353833383037313465633137396637353762 Feb 13 07:53:08.753000 audit: BPF prog-id=129 op=UNLOAD Feb 13 07:53:08.753000 audit: BPF prog-id=128 op=UNLOAD Feb 13 07:53:08.753000 audit[3167]: AVC avc: denied { bpf } for pid=3167 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.753000 audit[3167]: AVC avc: denied { bpf } for pid=3167 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.753000 audit[3167]: AVC avc: denied { bpf } for pid=3167 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.753000 audit[3167]: AVC avc: denied { perfmon } for pid=3167 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.753000 audit[3167]: AVC avc: denied { perfmon } for pid=3167 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.753000 audit[3167]: AVC avc: denied { perfmon } for pid=3167 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.753000 audit[3167]: AVC avc: denied { perfmon } for pid=3167 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.753000 audit[3167]: AVC avc: denied { perfmon } for pid=3167 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.753000 audit[3167]: AVC avc: denied { bpf } for pid=3167 
comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.753000 audit[3166]: AVC avc: denied { bpf } for pid=3166 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.753000 audit: BPF prog-id=130 op=LOAD Feb 13 07:53:08.753000 audit[3166]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=5 a1=c000197c30 a2=78 a3=c00032ef28 items=0 ppid=3147 pid=3166 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:08.753000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3332653134383062343036633932313037633439343136613438316164 Feb 13 07:53:08.753000 audit[3167]: AVC avc: denied { bpf } for pid=3167 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:08.753000 audit: BPF prog-id=131 op=LOAD Feb 13 07:53:08.753000 audit[3167]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=5 a1=c000145c30 a2=78 a3=c0003c6008 items=0 ppid=3148 pid=3167 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:08.753000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6561386465366132656438353833383037313465633137396637353762 Feb 13 07:53:08.759091 env[1458]: time="2024-02-13T07:53:08.759064383Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-9qktd,Uid:aa3ede51-df50-4639-8214-f4851f8dfc8e,Namespace:calico-system,Attempt:0,} returns sandbox id \"32e1480b406c92107c49416a481ad310090768900478462218b77118e0b37d34\"" Feb 13 07:53:08.759802 env[1458]: time="2024-02-13T07:53:08.759787760Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.27.0\"" Feb 13 07:53:08.753000 audit[3201]: NETFILTER_CFG table=nat:94 family=2 entries=12 op=nft_register_rule pid=3201 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 13 07:53:08.753000 audit[3201]: SYSCALL arch=c000003e syscall=46 success=yes exit=2572 a0=3 a1=7ffd1decec60 a2=0 a3=31030 items=0 ppid=2822 pid=3201 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:08.753000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 07:53:08.771272 env[1458]: time="2024-02-13T07:53:08.771246025Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-9448844cf-2sfsw,Uid:ae1b9d70-8574-4fcb-9d3f-391d183b25cd,Namespace:calico-system,Attempt:0,} returns sandbox id \"ea8de6a2ed858380714ec179f757b318f4d83d80e1b98e8e7b10a03fae987ad8\"" Feb 13 07:53:09.705847 kubelet[2569]: E0213 07:53:09.705758 2569 pod_workers.go:1294] "Error syncing 
pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b [the identical "Error syncing pod, skipping ... cni plugin not initialized" record for csi-node-driver-8djc9 recurs every two seconds through 07:53:33; verbatim repeats elided] Feb 13 07:53:12.287813 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1631087834.mount: Deactivated successfully. Feb 13 07:53:35.705252 kubelet[2569]: E0213 07:53:35.705207 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 07:53:35.738068 env[1458]: time="2024-02-13T07:53:35.738045247Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.27.0,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 07:53:35.738709 env[1458]: time="2024-02-13T07:53:35.738696522Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:6506d2e0be2d5ec9cb8dbe00c4b4f037c67b6ab4ec14a1f0c83333ac51f4da9a,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 07:53:35.740295 env[1458]: time="2024-02-13T07:53:35.740265509Z" level=info msg="ImageUpdate event &ImageUpdate{Name:ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.27.0,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 07:53:35.741458 env[1458]: time="2024-02-13T07:53:35.741444339Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:b05edbd1f80db4ada229e6001a666a7dd36bb6ab617143684fb3d28abfc4b71e,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 07:53:35.742563 env[1458]: time="2024-02-13T07:53:35.742548683Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.27.0\" returns image reference \"sha256:6506d2e0be2d5ec9cb8dbe00c4b4f037c67b6ab4ec14a1f0c83333ac51f4da9a\"" Feb 13 07:53:35.742808 env[1458]: time="2024-02-13T07:53:35.742794053Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.27.0\"" Feb 13 07:53:35.743475 env[1458]: time="2024-02-13T07:53:35.743460080Z" level=info msg="CreateContainer within sandbox \"32e1480b406c92107c49416a481ad310090768900478462218b77118e0b37d34\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" 
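Two identifiers appear above for the same image: PullImage returns the local image ID (the config blob digest sha256:6506...), while the ImageCreate event with @sha256:b05e... records the registry manifest digest. Only the manifest digest is stable across registries, so a pod spec that must not drift can pin to it. A trivial sketch assembling both forms from the values logged above:

```go
// Sketch: the two identifiers containerd logged for the pulled image, and
// how a digest-pinned reference is built so the v3.27.0 tag cannot move
// underneath a deployment.
package main

import "fmt"

func main() {
	repo := "ghcr.io/flatcar/calico/pod2daemon-flexvol"
	tag := "v3.27.0"
	manifestDigest := "sha256:b05edbd1f80db4ada229e6001a666a7dd36bb6ab617143684fb3d28abfc4b71e"
	imageID := "sha256:6506d2e0be2d5ec9cb8dbe00c4b4f037c67b6ab4ec14a1f0c83333ac51f4da9a"

	fmt.Println("mutable reference:", repo+":"+tag)
	fmt.Println("pinned reference: ", repo+"@"+manifestDigest)
	fmt.Println("local image ID:   ", imageID)
}
```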
Feb 13 07:53:35.748513 env[1458]: time="2024-02-13T07:53:35.748492624Z" level=info msg="CreateContainer within sandbox \"32e1480b406c92107c49416a481ad310090768900478462218b77118e0b37d34\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"a328a3cea42c0893460a9a6704918c5635b68828957e87496fa8828098145f32\"" Feb 13 07:53:35.748848 env[1458]: time="2024-02-13T07:53:35.748813167Z" level=info msg="StartContainer for \"a328a3cea42c0893460a9a6704918c5635b68828957e87496fa8828098145f32\"" Feb 13 07:53:35.757539 systemd[1]: Started cri-containerd-a328a3cea42c0893460a9a6704918c5635b68828957e87496fa8828098145f32.scope. Feb 13 07:53:35.764000 audit[3228]: AVC avc: denied { perfmon } for pid=3228 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:35.790624 kernel: kauditd_printk_skb: 116 callbacks suppressed Feb 13 07:53:35.790687 kernel: audit: type=1400 audit(1707810815.764:1103): avc: denied { perfmon } for pid=3228 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:35.764000 audit[3228]: SYSCALL arch=c000003e syscall=321 success=yes exit=15 a0=0 a1=c0001bf6b0 a2=3c a3=7f8e519ea978 items=0 ppid=3147 pid=3228 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:35.852702 kernel: audit: type=1300 audit(1707810815.764:1103): arch=c000003e syscall=321 success=yes exit=15 a0=0 a1=c0001bf6b0 a2=3c a3=7f8e519ea978 items=0 ppid=3147 pid=3228 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:35.764000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6133323861336365613432633038393334363061396136373034393138 Feb 13 07:53:36.030959 kernel: audit: type=1327 audit(1707810815.764:1103): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6133323861336365613432633038393334363061396136373034393138 Feb 13 07:53:36.030995 kernel: audit: type=1400 audit(1707810815.764:1104): avc: denied { bpf } for pid=3228 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:35.764000 audit[3228]: AVC avc: denied { bpf } for pid=3228 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:36.090629 kernel: audit: type=1400 audit(1707810815.764:1104): avc: denied { bpf } for pid=3228 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:35.764000 audit[3228]: AVC avc: denied { bpf } for pid=3228 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:36.150351 kernel: audit: type=1400 audit(1707810815.764:1104): 
avc: denied { bpf } for pid=3228 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:35.764000 audit[3228]: AVC avc: denied { bpf } for pid=3228 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:36.210092 kernel: audit: type=1400 audit(1707810815.764:1104): avc: denied { perfmon } for pid=3228 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:35.764000 audit[3228]: AVC avc: denied { perfmon } for pid=3228 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:36.232587 env[1458]: time="2024-02-13T07:53:36.232508755Z" level=info msg="StartContainer for \"a328a3cea42c0893460a9a6704918c5635b68828957e87496fa8828098145f32\" returns successfully" Feb 13 07:53:36.270301 kernel: audit: type=1400 audit(1707810815.764:1104): avc: denied { perfmon } for pid=3228 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:35.764000 audit[3228]: AVC avc: denied { perfmon } for pid=3228 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:36.270565 systemd[1]: cri-containerd-a328a3cea42c0893460a9a6704918c5635b68828957e87496fa8828098145f32.scope: Deactivated successfully. Feb 13 07:53:36.279938 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-a328a3cea42c0893460a9a6704918c5635b68828957e87496fa8828098145f32-rootfs.mount: Deactivated successfully. 
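The audit PROCTITLE fields scattered through this log are the process argv, hex-encoded with NUL bytes separating the arguments. A short Go sketch decodes one of them (the iptables-restore record from 07:53:08.499) back into a readable command line:

```go
// Decode an audit PROCTITLE value: hex string -> raw bytes, then split on
// the NUL separators between argv entries. The constant below is copied
// from the NETFILTER_CFG record earlier in this log.
package main

import (
	"encoding/hex"
	"fmt"
	"strings"
)

func main() {
	const proctitle = "69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273"

	raw, err := hex.DecodeString(proctitle)
	if err != nil {
		panic(err)
	}
	args := strings.Split(string(raw), "\x00")
	fmt.Println(strings.Join(args, " "))
	// Output: iptables-restore -w 5 -W 100000 --noflush --counters
}
```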
Feb 13 07:53:36.330511 kernel: audit: type=1400 audit(1707810815.764:1104): avc: denied { perfmon } for pid=3228 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:35.764000 audit[3228]: AVC avc: denied { perfmon } for pid=3228 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:36.390934 kernel: audit: type=1400 audit(1707810815.764:1104): avc: denied { perfmon } for pid=3228 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:35.764000 audit[3228]: AVC avc: denied { perfmon } for pid=3228 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:35.764000 audit[3228]: AVC avc: denied { perfmon } for pid=3228 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:35.764000 audit[3228]: AVC avc: denied { bpf } for pid=3228 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:35.764000 audit[3228]: AVC avc: denied { bpf } for pid=3228 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:35.764000 audit: BPF prog-id=132 op=LOAD Feb 13 07:53:35.764000 audit[3228]: SYSCALL arch=c000003e syscall=321 success=yes exit=15 a0=5 a1=c0001bf9d8 a2=78 a3=c0003b60c8 items=0 ppid=3147 pid=3228 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:35.764000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6133323861336365613432633038393334363061396136373034393138 Feb 13 07:53:35.851000 audit[3228]: AVC avc: denied { bpf } for pid=3228 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:35.851000 audit[3228]: AVC avc: denied { bpf } for pid=3228 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:35.851000 audit[3228]: AVC avc: denied { perfmon } for pid=3228 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:35.851000 audit[3228]: AVC avc: denied { perfmon } for pid=3228 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:35.851000 audit[3228]: AVC avc: denied { perfmon } for pid=3228 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:35.851000 audit[3228]: AVC avc: denied { perfmon } for pid=3228 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 
07:53:35.851000 audit[3228]: AVC avc: denied { perfmon } for pid=3228 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:35.851000 audit[3228]: AVC avc: denied { bpf } for pid=3228 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:35.851000 audit[3228]: AVC avc: denied { bpf } for pid=3228 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:35.851000 audit: BPF prog-id=133 op=LOAD Feb 13 07:53:35.851000 audit[3228]: SYSCALL arch=c000003e syscall=321 success=yes exit=17 a0=5 a1=c0001bf770 a2=78 a3=c0003b6118 items=0 ppid=3147 pid=3228 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:35.851000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6133323861336365613432633038393334363061396136373034393138 Feb 13 07:53:36.030000 audit: BPF prog-id=133 op=UNLOAD Feb 13 07:53:36.030000 audit: BPF prog-id=132 op=UNLOAD Feb 13 07:53:36.030000 audit[3228]: AVC avc: denied { bpf } for pid=3228 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:36.030000 audit[3228]: AVC avc: denied { bpf } for pid=3228 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:36.030000 audit[3228]: AVC avc: denied { bpf } for pid=3228 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:36.030000 audit[3228]: AVC avc: denied { perfmon } for pid=3228 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:36.030000 audit[3228]: AVC avc: denied { perfmon } for pid=3228 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:36.030000 audit[3228]: AVC avc: denied { perfmon } for pid=3228 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:36.030000 audit[3228]: AVC avc: denied { perfmon } for pid=3228 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:36.030000 audit[3228]: AVC avc: denied { perfmon } for pid=3228 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:36.030000 audit[3228]: AVC avc: denied { bpf } for pid=3228 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:36.030000 audit[3228]: AVC avc: denied { bpf } for pid=3228 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 
permissive=0 Feb 13 07:53:36.030000 audit: BPF prog-id=134 op=LOAD Feb 13 07:53:36.030000 audit[3228]: SYSCALL arch=c000003e syscall=321 success=yes exit=15 a0=5 a1=c0001bfc30 a2=78 a3=c0003b61a8 items=0 ppid=3147 pid=3228 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:36.030000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6133323861336365613432633038393334363061396136373034393138 Feb 13 07:53:36.457000 audit: BPF prog-id=134 op=UNLOAD Feb 13 07:53:36.570415 env[1458]: time="2024-02-13T07:53:36.570318514Z" level=info msg="shim disconnected" id=a328a3cea42c0893460a9a6704918c5635b68828957e87496fa8828098145f32 Feb 13 07:53:36.570808 env[1458]: time="2024-02-13T07:53:36.570413799Z" level=warning msg="cleaning up after shim disconnected" id=a328a3cea42c0893460a9a6704918c5635b68828957e87496fa8828098145f32 namespace=k8s.io Feb 13 07:53:36.570808 env[1458]: time="2024-02-13T07:53:36.570442383Z" level=info msg="cleaning up dead shim" Feb 13 07:53:36.585453 env[1458]: time="2024-02-13T07:53:36.585252930Z" level=warning msg="cleanup warnings time=\"2024-02-13T07:53:36Z\" level=info msg=\"starting signal loop\" namespace=k8s.io pid=3266 runtime=io.containerd.runc.v2\n" Feb 13 07:53:37.705285 kubelet[2569]: E0213 07:53:37.705174 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 07:53:39.706123 kubelet[2569]: E0213 07:53:39.706027 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 07:53:40.151611 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2572923140.mount: Deactivated successfully. 
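
A PROCTITLE record carries the audited process's command line as NUL-separated argv bytes, hex-encoded and truncated by the kernel at 128 bytes (which is why the container IDs above cut off mid-string). Decoding is mechanical; a sketch over the leading portion of the runc records above:

    # Hex payload is the start of the PROCTITLE records above.
    hex_payload = (
        "72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E63"
        "2F6B38732E696F"
    )
    argv = bytes.fromhex(hex_payload).split(b"\x00")
    print(" ".join(a.decode() for a in argv))
    # -> runc --root /run/containerd/runc/k8s.io

The kube-controller-manager and kube-apiserver PROCTITLE records later in the log decode the same way (to "kube-controller-manager --allocate-node-cidrs=true ..." and "kube-apiserver --advertise-address=145.40.90.207 ...", likewise truncated).
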
Feb 13 07:53:41.705479 kubelet[2569]: E0213 07:53:41.705421 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 07:53:43.706050 kubelet[2569]: E0213 07:53:43.705937 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 07:53:45.704841 kubelet[2569]: E0213 07:53:45.704822 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 07:53:45.974213 env[1458]: time="2024-02-13T07:53:45.974131379Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/typha:v3.27.0,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 07:53:45.974897 env[1458]: time="2024-02-13T07:53:45.974856188Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:b33768e0da1f8a5788a6a5d8ac2dcf15292ea9f3717de450f946c0a055b3532c,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 07:53:45.976328 env[1458]: time="2024-02-13T07:53:45.976292417Z" level=info msg="ImageUpdate event &ImageUpdate{Name:ghcr.io/flatcar/calico/typha:v3.27.0,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 07:53:45.977262 env[1458]: time="2024-02-13T07:53:45.977224131Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/typha@sha256:5f2d3b8c354a4eb6de46e786889913916e620c6c256982fb8d0f1a1d36a282bc,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 07:53:45.978149 env[1458]: time="2024-02-13T07:53:45.978107418Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.27.0\" returns image reference \"sha256:b33768e0da1f8a5788a6a5d8ac2dcf15292ea9f3717de450f946c0a055b3532c\"" Feb 13 07:53:45.978530 env[1458]: time="2024-02-13T07:53:45.978516167Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.27.0\"" Feb 13 07:53:45.982331 env[1458]: time="2024-02-13T07:53:45.982312728Z" level=info msg="CreateContainer within sandbox \"ea8de6a2ed858380714ec179f757b318f4d83d80e1b98e8e7b10a03fae987ad8\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Feb 13 07:53:45.986246 env[1458]: time="2024-02-13T07:53:45.986228258Z" level=info msg="CreateContainer within sandbox \"ea8de6a2ed858380714ec179f757b318f4d83d80e1b98e8e7b10a03fae987ad8\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"f2f0d74781ef6554e6326db7b24a7612d06fd7a12eebb0ca13983979b4e9e4dc\"" Feb 13 07:53:45.986463 env[1458]: time="2024-02-13T07:53:45.986446312Z" level=info msg="StartContainer for \"f2f0d74781ef6554e6326db7b24a7612d06fd7a12eebb0ca13983979b4e9e4dc\"" Feb 13 07:53:45.994383 systemd[1]: Started cri-containerd-f2f0d74781ef6554e6326db7b24a7612d06fd7a12eebb0ca13983979b4e9e4dc.scope. 
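
The env[1458] lines are containerd's key=value output, with quotes backslash-escaped wherever a value embeds its own quoted strings. A rough field splitter, as a sketch that only assumes the time=/level=/msg= shapes visible in this log:

    import re

    line = ('time="2024-02-13T07:53:45.986446312Z" level=info '
            'msg="StartContainer for \\"f2f0d74781ef6554e6326db7b24a7612d06fd7a12eebb0ca13983979b4e9e4dc\\""')

    # key="value" (allowing \" inside the value) or key=bareword
    pairs = re.findall(r'(\w+)=("(?:[^"\\]|\\.)*"|\S+)', line)
    fields = {k: (v[1:-1].replace('\\"', '"') if v.startswith('"') else v)
              for k, v in pairs}
    print(fields["level"], "|", fields["msg"])
    # -> info | StartContainer for "f2f0d747...e4dc"  (ID abbreviated here)
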
Feb 13 07:53:45.999000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:46.027071 kernel: kauditd_printk_skb: 34 callbacks suppressed Feb 13 07:53:46.027143 kernel: audit: type=1400 audit(1707810825.999:1110): avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:45.999000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:46.146612 kernel: audit: type=1400 audit(1707810825.999:1111): avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:46.146647 kernel: audit: type=1400 audit(1707810825.999:1112): avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:45.999000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:46.206688 kernel: audit: type=1400 audit(1707810825.999:1113): avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:45.999000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:46.268243 kernel: audit: type=1400 audit(1707810825.999:1114): avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:45.999000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:46.330197 kernel: audit: type=1400 audit(1707810825.999:1115): avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:46.330228 kernel: audit: audit_backlog=65 > audit_backlog_limit=64 Feb 13 07:53:45.999000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:46.392165 kernel: audit: type=1400 audit(1707810825.999:1116): avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:46.392194 kernel: audit: type=1400 audit(1707810825.999:1117): avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:46.392210 kernel: audit: type=1400 audit(1707810825.999:1118): avc: denied { bpf } for pid=1 comm="systemd" capability=39 
scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:45.999000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:45.999000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:45.999000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:46.086000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:46.086000 audit: BPF prog-id=135 op=LOAD Feb 13 07:53:46.086000 audit[3292]: AVC avc: denied { bpf } for pid=3292 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:46.086000 audit[3292]: SYSCALL arch=c000003e syscall=321 success=yes exit=0 a0=f a1=c000199c48 a2=10 a3=1c items=0 ppid=3148 pid=3292 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:46.086000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6632663064373437383165663635353465363332366462376232346137 Feb 13 07:53:46.086000 audit[3292]: AVC avc: denied { perfmon } for pid=3292 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:46.086000 audit[3292]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=0 a1=c0001996b0 a2=3c a3=8 items=0 ppid=3148 pid=3292 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:46.086000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6632663064373437383165663635353465363332366462376232346137 Feb 13 07:53:46.086000 audit[3292]: AVC avc: denied { bpf } for pid=3292 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:46.086000 audit[3292]: AVC avc: denied { bpf } for pid=3292 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:46.086000 audit[3292]: AVC avc: denied { bpf } for pid=3292 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:46.086000 audit[3292]: AVC avc: denied { perfmon } for pid=3292 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 
permissive=0 Feb 13 07:53:46.086000 audit[3292]: AVC avc: denied { perfmon } for pid=3292 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:46.086000 audit[3292]: AVC avc: denied { perfmon } for pid=3292 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:46.086000 audit[3292]: AVC avc: denied { perfmon } for pid=3292 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:46.086000 audit[3292]: AVC avc: denied { perfmon } for pid=3292 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:46.086000 audit[3292]: AVC avc: denied { bpf } for pid=3292 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:46.105000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:53:46.105000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c000b98ff0 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 07:53:46.105000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 07:53:46.105000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:53:46.105000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c0010a1340 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 07:53:46.105000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 07:53:46.178000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:53:46.178000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5a a1=c008e3b5f0 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" 
exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 07:53:46.178000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 07:53:46.178000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:53:46.178000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5a a1=c00e55c1a0 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 07:53:46.178000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 07:53:46.179000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/apiserver.crt" dev="sdb9" ino=524817 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:53:46.179000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5a a1=c003bd2ed0 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 07:53:46.179000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 07:53:46.086000 audit[3292]: AVC avc: denied { bpf } for pid=3292 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:46.086000 audit: BPF prog-id=136 op=LOAD Feb 13 07:53:46.086000 audit[3292]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=5 a1=c0001999d8 a2=78 a3=c00025fd10 items=0 ppid=3148 pid=3292 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:46.086000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6632663064373437383165663635353465363332366462376232346137 Feb 13 07:53:46.205000 audit[3292]: AVC avc: denied { bpf } for pid=3292 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:46.205000 audit[3292]: AVC avc: denied { bpf } for pid=3292 comm="runc" capability=39 
scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:46.205000 audit[3292]: AVC avc: denied { perfmon } for pid=3292 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:46.205000 audit[3292]: AVC avc: denied { perfmon } for pid=3292 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:46.205000 audit[3292]: AVC avc: denied { perfmon } for pid=3292 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:46.205000 audit[3292]: AVC avc: denied { perfmon } for pid=3292 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:46.205000 audit[3292]: AVC avc: denied { perfmon } for pid=3292 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:46.205000 audit[3292]: AVC avc: denied { bpf } for pid=3292 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:46.205000 audit[3292]: AVC avc: denied { bpf } for pid=3292 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:46.205000 audit: BPF prog-id=137 op=LOAD Feb 13 07:53:46.205000 audit[3292]: SYSCALL arch=c000003e syscall=321 success=yes exit=18 a0=5 a1=c000199770 a2=78 a3=c00025fd58 items=0 ppid=3148 pid=3292 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:46.205000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6632663064373437383165663635353465363332366462376232346137 Feb 13 07:53:46.328000 audit: BPF prog-id=137 op=UNLOAD Feb 13 07:53:46.328000 audit: BPF prog-id=136 op=UNLOAD Feb 13 07:53:46.328000 audit[3292]: AVC avc: denied { bpf } for pid=3292 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:46.328000 audit[3292]: AVC avc: denied { bpf } for pid=3292 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:46.328000 audit[3292]: AVC avc: denied { bpf } for pid=3292 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:46.328000 audit[3292]: AVC avc: denied { perfmon } for pid=3292 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:46.328000 audit[3292]: AVC avc: denied { perfmon } for pid=3292 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:46.328000 audit[3292]: AVC avc: denied { perfmon } for pid=3292 
comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 07:53:46.328000 audit[3292]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=5 a1=c000199c30 a2=78 a3=c00034a168 items=0 ppid=3148 pid=3292 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:46.328000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6632663064373437383165663635353465363332366462376232346137 Feb 13 07:53:46.618685 env[1458]: time="2024-02-13T07:53:46.618647335Z" level=info msg="StartContainer for \"f2f0d74781ef6554e6326db7b24a7612d06fd7a12eebb0ca13983979b4e9e4dc\" returns successfully" Feb 13 07:53:46.872411 kubelet[2569]: I0213 07:53:46.872344 2569 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="calico-system/calico-typha-9448844cf-2sfsw" podStartSLOduration=2.665784542 podCreationTimestamp="2024-02-13 07:53:07 +0000 UTC" firstStartedPulling="2024-02-13 07:53:08.771828277 +0000 UTC m=+19.122879974" lastFinishedPulling="2024-02-13 07:53:45.97831777 +0000 UTC m=+56.329369465" observedRunningTime="2024-02-13 07:53:46.871786068 +0000 UTC m=+57.222837763" watchObservedRunningTime="2024-02-13 07:53:46.872274033 +0000 UTC m=+57.223325725" Feb 13 07:53:46.881000 audit[3337]: NETFILTER_CFG table=filter:95 family=2 entries=15 op=nft_register_rule pid=3337 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 13 07:53:46.881000 audit[3337]: SYSCALL arch=c000003e syscall=46 success=yes exit=4956 a0=3 a1=7fff1251f780 a2=0 a3=7fff1251f76c items=0 ppid=2822 pid=3337 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:46.881000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 07:53:46.882000 audit[3337]: NETFILTER_CFG table=nat:96 family=2 entries=19 op=nft_register_chain pid=3337 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 13 07:53:46.882000 audit[3337]: SYSCALL arch=c000003e syscall=46 success=yes exit=6068 a0=3 a1=7fff1251f780 a2=0 a3=7fff1251f76c items=0 ppid=2822 pid=3337 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 07:53:46.882000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 07:53:46.921000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-client.crt" dev="sdb9" ino=524823 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:53:46.921000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c005b78cf0 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" 
subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 07:53:46.921000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 07:53:46.921000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:53:46.921000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5c a1=c008fe8ed0 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 07:53:46.921000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 07:53:46.921000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:53:46.921000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5d a1=c0091689c0 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 07:53:46.921000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 07:53:47.705877 kubelet[2569]: E0213 07:53:47.705786 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 07:53:49.408646 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1412386634.mount: Deactivated successfully. 
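
In the SYSCALL records paired with the { watch } denials, arch=c000003e is AUDIT_ARCH_X86_64; on that architecture syscall 254 is inotify_add_watch, and syscall 321 (in the runc records further up) is bpf. success=no exit=-13 is a negated errno, decodable in one line:

    import errno, os

    exit_code = -13                     # from the SYSCALL records above
    print(errno.errorcode[-exit_code])  # -> EACCES
    print(os.strerror(-exit_code))      # -> Permission denied

So kube-apiserver and kube-controller-manager are being denied inotify watches on the /etc/kubernetes/pki certificate files; both processes evidently keep running through the rest of the log, so these denials read as noise rather than a failure.
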
Feb 13 07:53:49.705821 kubelet[2569]: E0213 07:53:49.705739 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 07:53:50.856000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:53:50.856000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c000b22fc0 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 07:53:50.856000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 07:53:50.860000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:53:50.860000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c0007abfc0 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 07:53:50.860000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 07:53:50.863000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:53:50.863000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c0007abfe0 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 07:53:50.863000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 07:53:50.866000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:53:50.866000 audit[2387]: 
SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c000d7e800 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 07:53:50.866000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 07:53:51.706127 kubelet[2569]: E0213 07:53:51.706027 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 07:53:53.705942 kubelet[2569]: E0213 07:53:53.705840 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 07:53:55.705398 kubelet[2569]: E0213 07:53:55.705329 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 07:53:57.706055 kubelet[2569]: E0213 07:53:57.705955 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 07:53:59.706781 kubelet[2569]: E0213 07:53:59.706671 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 07:54:01.705917 kubelet[2569]: E0213 07:54:01.705825 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 07:54:03.706010 kubelet[2569]: E0213 07:54:03.705911 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 07:54:05.706070 kubelet[2569]: E0213 07:54:05.705967 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 07:54:07.705599 kubelet[2569]: E0213 07:54:07.705502 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 07:54:09.705358 kubelet[2569]: E0213 07:54:09.705298 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 07:54:11.706130 kubelet[2569]: E0213 07:54:11.706071 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 07:54:13.705126 kubelet[2569]: E0213 07:54:13.705027 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 07:54:15.706034 kubelet[2569]: E0213 07:54:15.705941 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 07:54:17.705769 kubelet[2569]: E0213 07:54:17.705718 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 07:54:19.706137 kubelet[2569]: E0213 07:54:19.706046 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 07:54:21.705292 kubelet[2569]: E0213 07:54:21.705188 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 07:54:23.706000 kubelet[2569]: E0213 07:54:23.705932 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" 
podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 07:54:25.705654 kubelet[2569]: E0213 07:54:25.705572 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 07:54:27.705789 kubelet[2569]: E0213 07:54:27.705729 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 07:54:29.705718 kubelet[2569]: E0213 07:54:29.705608 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 07:54:31.705904 kubelet[2569]: E0213 07:54:31.705796 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 07:54:33.705824 kubelet[2569]: E0213 07:54:33.705728 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 07:54:35.705678 kubelet[2569]: E0213 07:54:35.705590 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 07:54:37.705643 kubelet[2569]: E0213 07:54:37.705575 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 07:54:39.706555 kubelet[2569]: E0213 07:54:39.706481 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 07:54:41.707268 kubelet[2569]: E0213 07:54:41.707163 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 07:54:43.705942 kubelet[2569]: E0213 07:54:43.705843 2569 pod_workers.go:1294] 
"Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 07:54:45.705327 kubelet[2569]: E0213 07:54:45.705212 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 07:54:46.107000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:54:46.134354 kernel: kauditd_printk_skb: 99 callbacks suppressed Feb 13 07:54:46.134396 kernel: audit: type=1400 audit(1707810886.107:1142): avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:54:46.107000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c000353620 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 07:54:46.349309 kernel: audit: type=1300 audit(1707810886.107:1142): arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c000353620 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 07:54:46.349343 kernel: audit: type=1327 audit(1707810886.107:1142): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 07:54:46.107000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 07:54:46.442975 kernel: audit: type=1400 audit(1707810886.107:1143): avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:54:46.107000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:54:46.536804 kernel: audit: type=1300 audit(1707810886.107:1143): arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c001ff9bf0 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null)
Feb 13 07:54:46.107000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c001ff9bf0 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null)
Feb 13 07:54:46.660375 kernel: audit: type=1327 audit(1707810886.107:1143): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269
Feb 13 07:54:46.107000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269
Feb 13 07:54:46.753915 kernel: audit: type=1400 audit(1707810886.179:1144): avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 07:54:46.179000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 07:54:46.844193 kernel: audit: type=1300 audit(1707810886.179:1144): arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c008c43a60 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null)
Feb 13 07:54:46.179000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c008c43a60 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null)
Feb 13 07:54:46.942637 kernel: audit: type=1327 audit(1707810886.179:1144): proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75
Feb 13 07:54:46.179000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75
Feb 13 07:54:47.036240 kernel: audit: type=1400 audit(1707810886.179:1145): avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 07:54:46.179000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 07:54:46.179000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5c a1=c0158416b0 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null)
Feb 13 07:54:46.179000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75
Feb 13 07:54:46.181000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/apiserver.crt" dev="sdb9" ino=524817 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 07:54:46.181000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c015841710 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null)
Feb 13 07:54:46.181000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75
Feb 13 07:54:46.922000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-client.crt" dev="sdb9" ino=524823 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 07:54:46.922000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c00a76d4d0 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null)
Feb 13 07:54:46.922000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75
Feb 13 07:54:46.922000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 07:54:46.922000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c00ad178c0 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null)
Feb 13 07:54:46.922000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75
Feb 13 07:54:46.922000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 07:54:46.922000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c0158417d0 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null)
Feb 13 07:54:46.922000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75
Feb 13 07:54:47.705512 kubelet[2569]: E0213 07:54:47.705456 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b
Feb 13 07:54:49.704703 kubelet[2569]: E0213 07:54:49.704678 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b
Feb 13 07:54:49.716891 kubelet[2569]: E0213 07:54:49.716865 2569 kubelet_node_status.go:452] "Node not becoming ready in time after startup"
Feb 13 07:54:49.756577 kubelet[2569]: E0213 07:54:49.756537 2569 kubelet.go:2760] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized"
Feb 13 07:54:50.858000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 07:54:50.858000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c0006a5d00 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null)
Feb 13 07:54:50.858000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269
Feb 13 07:54:50.862000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 07:54:50.862000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c000353e40 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null)
Feb 13 07:54:50.862000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269
Feb 13 07:54:50.864000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 07:54:50.864000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c0006a5e20 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null)
Feb 13 07:54:50.864000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269
Feb 13 07:54:50.868000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 07:54:50.868000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c000b230c0 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null)
Feb 13 07:54:50.868000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269
Feb 13 07:54:51.705977 kubelet[2569]: E0213 07:54:51.705879 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b
Feb 13 07:54:53.705499 kubelet[2569]: E0213 07:54:53.705391 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b
Feb 13 07:54:54.758047 kubelet[2569]: E0213 07:54:54.757948 2569 kubelet.go:2760] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized"
Feb 13 07:54:55.705529 kubelet[2569]: E0213 07:54:55.705430 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b
Feb 13 07:54:57.705863 kubelet[2569]: E0213 07:54:57.705805 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b
Feb 13 07:54:59.706740 kubelet[2569]: E0213 07:54:59.706647 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b
Feb 13 07:54:59.759393 kubelet[2569]: E0213 07:54:59.759300 2569 kubelet.go:2760] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized"
Feb 13 07:55:01.705855 kubelet[2569]: E0213 07:55:01.705757 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b
Feb 13 07:55:03.705958 kubelet[2569]: E0213 07:55:03.705890 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b
Feb 13 07:55:04.760541 kubelet[2569]: E0213 07:55:04.760440 2569 kubelet.go:2760] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized"
Feb 13 07:55:05.706117 kubelet[2569]: E0213 07:55:05.706008 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b
Feb 13 07:55:07.705354 kubelet[2569]: E0213 07:55:07.705257 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b
Feb 13 07:55:09.705644 kubelet[2569]: E0213 07:55:09.705609 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b
Feb 13 07:55:09.761771 kubelet[2569]: E0213 07:55:09.761747 2569 kubelet.go:2760] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized"
Feb 13 07:55:11.705669 kubelet[2569]: E0213 07:55:11.705594 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b
Feb 13 07:55:13.706068 kubelet[2569]: E0213 07:55:13.705967 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b
Feb 13 07:55:14.763169 kubelet[2569]: E0213 07:55:14.763069 2569 kubelet.go:2760] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized"
Feb 13 07:55:15.706008 kubelet[2569]: E0213 07:55:15.705894 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b
Feb 13 07:55:17.705288 kubelet[2569]: E0213 07:55:17.705102 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b
Feb 13 07:55:19.705541 kubelet[2569]: E0213 07:55:19.705468 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b
Feb 13 07:55:19.764237 kubelet[2569]: E0213 07:55:19.764132 2569 kubelet.go:2760] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized"
Feb 13 07:55:21.705337 kubelet[2569]: E0213 07:55:21.705270 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b
Feb 13 07:55:23.705438 kubelet[2569]: E0213 07:55:23.705370 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b
Feb 13 07:55:24.764636 kubelet[2569]: E0213 07:55:24.764591 2569 kubelet.go:2760] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized"
Feb 13 07:55:25.706182 kubelet[2569]: E0213 07:55:25.706120 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b
Feb 13 07:55:27.705940 kubelet[2569]: E0213 07:55:27.705871 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b
Feb 13 07:55:29.705337 kubelet[2569]: E0213 07:55:29.705291 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b
Feb 13 07:55:29.766128 kubelet[2569]: E0213 07:55:29.766035 2569 kubelet.go:2760] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized"
Feb 13 07:55:31.705973 kubelet[2569]: E0213 07:55:31.705901 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b
Feb 13 07:55:33.706210 kubelet[2569]: E0213 07:55:33.706106 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b
Feb 13 07:55:34.767992 kubelet[2569]: E0213 07:55:34.767901 2569 kubelet.go:2760] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized"
Feb 13 07:55:35.706032 kubelet[2569]: E0213 07:55:35.705931 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b
Feb 13 07:55:37.706106 kubelet[2569]: E0213 07:55:37.706003 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b
Feb 13 07:55:39.705745 kubelet[2569]: E0213 07:55:39.705602 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b
Feb 13 07:55:39.769953 kubelet[2569]: E0213 07:55:39.769888 2569 kubelet.go:2760] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized"
Feb 13 07:55:41.705345 kubelet[2569]: E0213 07:55:41.705247 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b
Feb 13 07:55:43.705481 kubelet[2569]: E0213 07:55:43.705408 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b
Feb 13 07:55:44.771137 kubelet[2569]: E0213 07:55:44.771036 2569 kubelet.go:2760] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized"
Feb 13 07:55:45.705781 kubelet[2569]: E0213 07:55:45.705708 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b
Feb 13 07:55:46.107000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 07:55:46.135695 kernel: kauditd_printk_skb: 26 callbacks suppressed
Feb 13 07:55:46.135751 kernel: audit: type=1400 audit(1707810946.107:1154): avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 07:55:46.107000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c000c24420 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null)
Feb 13 07:55:46.347018 kernel: audit: type=1300 audit(1707810946.107:1154): arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c000c24420 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null)
Feb 13 07:55:46.347055 kernel: audit: type=1327 audit(1707810946.107:1154): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269
Feb 13 07:55:46.107000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269
Feb 13 07:55:46.440105 kernel: audit: type=1400 audit(1707810946.107:1155): avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 07:55:46.107000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 07:55:46.531679 kernel: audit: type=1300 audit(1707810946.107:1155): arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c002a86ba0 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null)
Feb 13 07:55:46.107000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c002a86ba0 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null)
Feb 13 07:55:46.652376 kernel: audit: type=1327 audit(1707810946.107:1155): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269
Feb 13 07:55:46.107000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269
Feb 13 07:55:46.746627 kernel: audit: type=1400 audit(1707810946.180:1156): avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 07:55:46.180000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 07:55:46.837898 kernel: audit: type=1300 audit(1707810946.180:1156): arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c00588f260 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null)
Feb 13 07:55:46.180000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c00588f260 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null)
Feb 13 07:55:46.937703 kernel: audit: type=1327 audit(1707810946.180:1156): proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75
Feb 13 07:55:46.180000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75
Feb 13 07:55:46.180000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 07:55:47.121943 kernel: audit: type=1400 audit(1707810946.180:1157): avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 07:55:46.180000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c00ae94690 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null)
Feb 13 07:55:46.180000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75
Feb 13 07:55:46.181000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/apiserver.crt" dev="sdb9" ino=524817 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 07:55:46.181000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c00ae94720 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null)
Feb 13 07:55:46.181000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75
Feb 13 07:55:46.922000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 07:55:46.922000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c00ba72560 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null)
Feb 13 07:55:46.922000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75
Feb 13 07:55:46.922000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-client.crt" dev="sdb9" ino=524823 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 07:55:46.922000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c00ae948d0 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null)
Feb 13 07:55:46.922000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75
Feb 13 07:55:46.922000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 07:55:46.922000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c00ae94900 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null)
Feb 13 07:55:46.922000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75
Feb 13 07:55:47.705669 kubelet[2569]: E0213 07:55:47.705574 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b
Feb 13 07:55:49.705609 kubelet[2569]: E0213 07:55:49.705594 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b
Feb 13 07:55:49.772462 kubelet[2569]: E0213 07:55:49.772403 2569 kubelet.go:2760] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized"
Feb 13 07:55:50.858000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 07:55:50.858000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c00015f8a0 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null)
Feb 13 07:55:50.858000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269
Feb 13 07:55:50.862000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 07:55:50.862000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c000c246e0 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null)
Feb 13 07:55:50.862000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269
Feb 13 07:55:50.864000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 07:55:50.864000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c001b220c0 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null)
Feb 13 07:55:50.864000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269
Feb 13 07:55:50.869000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 07:55:50.869000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c001b22100 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null)
Feb 13 07:55:50.869000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269
Feb 13 07:55:51.705433 kubelet[2569]: E0213 07:55:51.705365 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b
Feb 13 07:55:53.705593 kubelet[2569]: E0213 07:55:53.705489 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b
Feb 13 07:55:54.774471 kubelet[2569]: E0213 07:55:54.774364 2569 kubelet.go:2760] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized"
Feb 13 07:55:55.705789 kubelet[2569]: E0213 07:55:55.705680 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b
Feb 13 07:55:57.705118 kubelet[2569]: E0213 07:55:57.705012 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b
Feb 13 07:55:59.705809 kubelet[2569]: E0213 07:55:59.705694 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b
Feb 13 07:55:59.776084 kubelet[2569]: E0213 07:55:59.775978 2569 kubelet.go:2760] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized"
Feb 13 07:56:01.705929 kubelet[2569]: E0213 07:56:01.705817 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b
Feb 13 07:56:03.704934 kubelet[2569]: E0213 07:56:03.704884 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b
Feb 13 07:56:04.476889 env[1458]: time="2024-02-13T07:56:04.476832040Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/cni:v3.27.0,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 13 07:56:04.477472 env[1458]: time="2024-02-13T07:56:04.477429406Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:8e8d96a874c0e2f137bc6e0ff4b9da4ac2341852e41d99ab81983d329bb87d93,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 13 07:56:04.478389 env[1458]: time="2024-02-13T07:56:04.478345809Z" level=info msg="ImageUpdate event &ImageUpdate{Name:ghcr.io/flatcar/calico/cni:v3.27.0,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 13 07:56:04.479344 env[1458]: time="2024-02-13T07:56:04.479310746Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/cni@sha256:d943b4c23e82a39b0186a1a3b2fe8f728e543d503df72d7be521501a82b7e7b4,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 13 07:56:04.479874 env[1458]: time="2024-02-13T07:56:04.479837769Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.27.0\" returns image reference \"sha256:8e8d96a874c0e2f137bc6e0ff4b9da4ac2341852e41d99ab81983d329bb87d93\""
Feb 13 07:56:04.492408 env[1458]: time="2024-02-13T07:56:04.492388851Z" level=info msg="CreateContainer within sandbox \"32e1480b406c92107c49416a481ad310090768900478462218b77118e0b37d34\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}"
Feb 13 07:56:04.497746 env[1458]: time="2024-02-13T07:56:04.497688717Z" level=info msg="CreateContainer within sandbox \"32e1480b406c92107c49416a481ad310090768900478462218b77118e0b37d34\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"91a6e97e313d064e8263a5c1ebd5dd0dbaadfd55111fa9c27b24ff918223dc53\""
Feb 13 07:56:04.498098 env[1458]: time="2024-02-13T07:56:04.498072283Z" level=info msg="StartContainer for \"91a6e97e313d064e8263a5c1ebd5dd0dbaadfd55111fa9c27b24ff918223dc53\""
Feb 13 07:56:04.508586 systemd[1]: Started cri-containerd-91a6e97e313d064e8263a5c1ebd5dd0dbaadfd55111fa9c27b24ff918223dc53.scope.
Feb 13 07:56:04.513000 audit[3363]: AVC avc: denied { perfmon } for pid=3363 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:56:04.542266 kernel: kauditd_printk_skb: 26 callbacks suppressed
Feb 13 07:56:04.542343 kernel: audit: type=1400 audit(1707810964.513:1166): avc: denied { perfmon } for pid=3363 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:56:04.513000 audit[3363]: SYSCALL arch=c000003e syscall=321 success=yes exit=15 a0=0 a1=c0001476b0 a2=3c a3=7fb551ac3b18 items=0 ppid=3147 pid=3363 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 07:56:04.701779 kernel: audit: type=1300 audit(1707810964.513:1166): arch=c000003e syscall=321 success=yes exit=15 a0=0 a1=c0001476b0 a2=3c a3=7fb551ac3b18 items=0 ppid=3147 pid=3363 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 07:56:04.701810 kernel: audit: type=1327 audit(1707810964.513:1166): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3931613665393765333133643036346538323633613563316562643564
Feb 13 07:56:04.513000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3931613665393765333133643036346538323633613563316562643564
Feb 13 07:56:04.776391 kubelet[2569]: E0213 07:56:04.776369 2569 kubelet.go:2760] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized"
Feb 13 07:56:04.793974 kernel: audit: type=1400 audit(1707810964.513:1167): avc: denied { bpf } for pid=3363 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:56:04.513000 audit[3363]: AVC avc: denied { bpf } for pid=3363 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:56:04.856838 kernel: audit: type=1400 audit(1707810964.513:1167): avc: denied { bpf } for pid=3363 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:56:04.513000 audit[3363]: AVC avc: denied { bpf } for pid=3363 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:56:04.919785 kernel: audit: type=1400 audit(1707810964.513:1167): avc: denied { bpf } for pid=3363 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:56:04.513000 audit[3363]: AVC avc: denied { bpf } for pid=3363 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:56:04.982998 kernel: audit: type=1400 audit(1707810964.513:1167): avc: denied { perfmon } for pid=3363 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:56:04.513000 audit[3363]: AVC avc: denied { perfmon } for pid=3363 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:56:05.013703 env[1458]: time="2024-02-13T07:56:05.013673003Z" level=info msg="StartContainer for \"91a6e97e313d064e8263a5c1ebd5dd0dbaadfd55111fa9c27b24ff918223dc53\" returns successfully"
Feb 13 07:56:05.046947 kernel: audit: type=1400 audit(1707810964.513:1167): avc: denied { perfmon } for pid=3363 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:56:04.513000 audit[3363]: AVC avc: denied { perfmon } for pid=3363 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:56:05.110924 kernel: audit: type=1400 audit(1707810964.513:1167): avc: denied { perfmon } for pid=3363 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:56:04.513000 audit[3363]: AVC avc: denied { perfmon } for pid=3363 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:56:04.513000 audit[3363]: AVC avc: denied { perfmon } for pid=3363 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:56:05.238788 kernel: audit: type=1400 audit(1707810964.513:1167): avc: denied { perfmon } for pid=3363 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:56:04.513000 audit[3363]: AVC avc: denied { perfmon } for pid=3363 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:56:04.513000 audit[3363]: AVC avc: denied { bpf } for pid=3363 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:56:04.513000 audit[3363]: AVC avc: denied { bpf } for pid=3363 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:56:04.513000 audit: BPF prog-id=139 op=LOAD
Feb 13 07:56:04.513000 audit[3363]: SYSCALL arch=c000003e syscall=321 success=yes exit=15 a0=5 a1=c0001479d8 a2=78 a3=c000387f68 items=0 ppid=3147 pid=3363 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 07:56:04.513000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3931613665393765333133643036346538323633613563316562643564
Feb 13 07:56:04.603000 audit[3363]: AVC avc: denied { bpf } for pid=3363 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:56:04.603000 audit[3363]: AVC avc: denied { bpf } for pid=3363 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:56:04.603000 audit[3363]: AVC avc: denied { perfmon } for pid=3363 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:56:04.603000 audit[3363]: AVC avc: denied { perfmon } for pid=3363 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:56:04.603000 audit[3363]: AVC avc: denied { perfmon } for pid=3363 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:56:04.603000 audit[3363]: AVC avc: denied { perfmon } for pid=3363 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:56:04.603000 audit[3363]: AVC avc: denied { perfmon } for pid=3363 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:56:04.603000 audit[3363]: AVC avc: denied { bpf } for pid=3363 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:56:04.603000 audit[3363]: AVC avc: denied { bpf } for pid=3363 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:56:04.603000 audit: BPF prog-id=140 op=LOAD
Feb 13 07:56:04.603000 audit[3363]: SYSCALL arch=c000003e syscall=321 success=yes exit=17 a0=5 a1=c000147770 a2=78 a3=c000387fb8 items=0 ppid=3147 pid=3363 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 07:56:04.603000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3931613665393765333133643036346538323633613563316562643564
Feb 13 07:56:04.792000 audit: BPF prog-id=140 op=UNLOAD
Feb 13 07:56:04.792000 audit: BPF prog-id=139 op=UNLOAD
Feb 13 07:56:04.792000 audit[3363]: AVC avc: denied { bpf } for pid=3363 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:56:04.792000 audit[3363]: AVC avc: denied { bpf } for pid=3363 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:56:04.792000 audit[3363]: AVC avc: denied { bpf } for pid=3363 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:56:04.792000 audit[3363]: AVC avc: denied { perfmon } for pid=3363 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:56:04.792000 audit[3363]: AVC avc: denied { perfmon } for pid=3363 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:56:04.792000 audit[3363]: AVC avc: denied { perfmon } for pid=3363 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:56:04.792000 audit[3363]: AVC avc: denied { perfmon } for pid=3363 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:56:04.792000 audit[3363]: AVC avc: denied { perfmon } for pid=3363 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:56:04.792000 audit[3363]: AVC avc: denied { bpf } for pid=3363 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:56:04.792000 audit[3363]: AVC avc: denied { bpf } for pid=3363 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 07:56:04.792000 audit: BPF prog-id=141 op=LOAD
Feb 13 07:56:04.792000 audit[3363]: SYSCALL arch=c000003e syscall=321 success=yes exit=15 a0=5 a1=c000147c30 a2=78 a3=c0003e4048 items=0 ppid=3147 pid=3363 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 07:56:04.792000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3931613665393765333133643036346538323633613563316562643564
Feb 13 07:56:05.680352 systemd[1]: cri-containerd-91a6e97e313d064e8263a5c1ebd5dd0dbaadfd55111fa9c27b24ff918223dc53.scope: Deactivated successfully.
Feb 13 07:56:05.690168 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-91a6e97e313d064e8263a5c1ebd5dd0dbaadfd55111fa9c27b24ff918223dc53-rootfs.mount: Deactivated successfully.
Feb 13 07:56:05.693000 audit: BPF prog-id=141 op=UNLOAD
Feb 13 07:56:05.704765 kubelet[2569]: E0213 07:56:05.704751 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b
Feb 13 07:56:06.179489 env[1458]: time="2024-02-13T07:56:06.179387975Z" level=info msg="shim disconnected" id=91a6e97e313d064e8263a5c1ebd5dd0dbaadfd55111fa9c27b24ff918223dc53
Feb 13 07:56:06.179489 env[1458]: time="2024-02-13T07:56:06.179486514Z" level=warning msg="cleaning up after shim disconnected" id=91a6e97e313d064e8263a5c1ebd5dd0dbaadfd55111fa9c27b24ff918223dc53 namespace=k8s.io
Feb 13 07:56:06.180493 env[1458]: time="2024-02-13T07:56:06.179516030Z" level=info msg="cleaning up dead shim"
Feb 13 07:56:06.195100 env[1458]: time="2024-02-13T07:56:06.194982890Z" level=warning msg="cleanup warnings time=\"2024-02-13T07:56:06Z\" level=info msg=\"starting signal loop\" namespace=k8s.io pid=3423 runtime=io.containerd.runc.v2\n"
Feb 13 07:56:06.232434 env[1458]: time="2024-02-13T07:56:06.232320557Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.27.0\""
Feb 13 07:56:07.705138 kubelet[2569]: E0213 07:56:07.705024 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b
Feb 13 07:56:09.706297 kubelet[2569]: E0213 07:56:09.706234 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b
Feb 13 07:56:11.710317 systemd[1]: Created slice kubepods-besteffort-podd3077153_a5bc_4449_ba4f_3a1b2983528b.slice.
Feb 13 07:56:11.712298 env[1458]: time="2024-02-13T07:56:11.712229837Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-8djc9,Uid:d3077153-a5bc-4449-ba4f-3a1b2983528b,Namespace:calico-system,Attempt:0,}"
Feb 13 07:56:11.746005 env[1458]: time="2024-02-13T07:56:11.745929665Z" level=error msg="Failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 07:56:11.746193 env[1458]: time="2024-02-13T07:56:11.746149796Z" level=error msg="encountered an error cleaning up failed sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 07:56:11.746193 env[1458]: time="2024-02-13T07:56:11.746181003Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-8djc9,Uid:d3077153-a5bc-4449-ba4f-3a1b2983528b,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 07:56:11.746418 kubelet[2569]: E0213 07:56:11.746377 2569 remote_runtime.go:176] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 07:56:11.746577 kubelet[2569]: E0213 07:56:11.746421 2569 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-8djc9"
Feb 13 07:56:11.746577 kubelet[2569]: E0213 07:56:11.746437 2569 kuberuntime_manager.go:1122] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-8djc9"
Feb 13 07:56:11.746577 kubelet[2569]: E0213 07:56:11.746479 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-8djc9_calico-system(d3077153-a5bc-4449-ba4f-3a1b2983528b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-8djc9_calico-system(d3077153-a5bc-4449-ba4f-3a1b2983528b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b
Feb 13 07:56:11.747221 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e-shm.mount: Deactivated successfully.
Feb 13 07:56:12.247439 kubelet[2569]: I0213 07:56:12.247339 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e"
Feb 13 07:56:12.248672 env[1458]: time="2024-02-13T07:56:12.248540837Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\""
Feb 13 07:56:12.301183 env[1458]: time="2024-02-13T07:56:12.301096853Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 07:56:12.301435 kubelet[2569]: E0213 07:56:12.301383 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e"
Feb 13 07:56:12.301534 kubelet[2569]: E0213 07:56:12.301444 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e}
Feb 13 07:56:12.301534 kubelet[2569]: E0213 07:56:12.301490 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 13 07:56:12.301534 kubelet[2569]: E0213 07:56:12.301524 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b
Feb 13 07:56:13.731713 kubelet[2569]: I0213 07:56:13.731605 2569 topology_manager.go:212] "Topology Admit Handler"
Feb 13 07:56:13.733158 kubelet[2569]: I0213 07:56:13.733087 2569 topology_manager.go:212] "Topology Admit Handler"
Feb 13 07:56:13.734109 kubelet[2569]: I0213 07:56:13.734029 2569 topology_manager.go:212] "Topology Admit Handler"
Feb 13 07:56:13.747238 systemd[1]: Created slice kubepods-burstable-pod9ec457c2_fc28_4626_897d_f9a56a1fa755.slice.
Feb 13 07:56:13.759744 systemd[1]: Created slice kubepods-burstable-pod18082ea3_5d5e_4eed_963b_be8271107e06.slice.
Feb 13 07:56:13.764364 systemd[1]: Created slice kubepods-besteffort-pod59431b79_ecac_4529_9083_2bad55873c23.slice.
Feb 13 07:56:13.840395 kubelet[2569]: I0213 07:56:13.840289 2569 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9ec457c2-fc28-4626-897d-f9a56a1fa755-config-volume\") pod \"coredns-5d78c9869d-qrnjl\" (UID: \"9ec457c2-fc28-4626-897d-f9a56a1fa755\") " pod="kube-system/coredns-5d78c9869d-qrnjl"
Feb 13 07:56:13.840749 kubelet[2569]: I0213 07:56:13.840563 2569 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m86vz\" (UniqueName: \"kubernetes.io/projected/59431b79-ecac-4529-9083-2bad55873c23-kube-api-access-m86vz\") pod \"calico-kube-controllers-846b88998b-4vbpv\" (UID: \"59431b79-ecac-4529-9083-2bad55873c23\") " pod="calico-system/calico-kube-controllers-846b88998b-4vbpv"
Feb 13 07:56:13.840903 kubelet[2569]: I0213 07:56:13.840789 2569 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mq84\" (UniqueName: \"kubernetes.io/projected/18082ea3-5d5e-4eed-963b-be8271107e06-kube-api-access-7mq84\") pod \"coredns-5d78c9869d-7xbl5\" (UID: \"18082ea3-5d5e-4eed-963b-be8271107e06\") " pod="kube-system/coredns-5d78c9869d-7xbl5"
Feb 13 07:56:13.841028 kubelet[2569]: I0213 07:56:13.840904 2569 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/18082ea3-5d5e-4eed-963b-be8271107e06-config-volume\") pod \"coredns-5d78c9869d-7xbl5\" (UID: \"18082ea3-5d5e-4eed-963b-be8271107e06\") " pod="kube-system/coredns-5d78c9869d-7xbl5"
Feb 13 07:56:13.841028 kubelet[2569]: I0213 07:56:13.841021 2569 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8tb9\" (UniqueName: \"kubernetes.io/projected/9ec457c2-fc28-4626-897d-f9a56a1fa755-kube-api-access-t8tb9\") pod \"coredns-5d78c9869d-qrnjl\" (UID: \"9ec457c2-fc28-4626-897d-f9a56a1fa755\") " pod="kube-system/coredns-5d78c9869d-qrnjl"
Feb 13 07:56:13.841255 kubelet[2569]: I0213 07:56:13.841121 2569 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/59431b79-ecac-4529-9083-2bad55873c23-tigera-ca-bundle\") pod \"calico-kube-controllers-846b88998b-4vbpv\" (UID: \"59431b79-ecac-4529-9083-2bad55873c23\") " pod="calico-system/calico-kube-controllers-846b88998b-4vbpv"
Feb 13 07:56:14.055219 env[1458]: time="2024-02-13T07:56:14.055090668Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-5d78c9869d-qrnjl,Uid:9ec457c2-fc28-4626-897d-f9a56a1fa755,Namespace:kube-system,Attempt:0,}"
Feb 13 07:56:14.064592 env[1458]: time="2024-02-13T07:56:14.064462418Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-5d78c9869d-7xbl5,Uid:18082ea3-5d5e-4eed-963b-be8271107e06,Namespace:kube-system,Attempt:0,}"
Feb 13 07:56:14.067705 env[1458]: time="2024-02-13T07:56:14.067586683Z" level=info msg="RunPodSandbox for
&PodSandboxMetadata{Name:calico-kube-controllers-846b88998b-4vbpv,Uid:59431b79-ecac-4529-9083-2bad55873c23,Namespace:calico-system,Attempt:0,}" Feb 13 07:56:14.148337 env[1458]: time="2024-02-13T07:56:14.148290825Z" level=error msg="Failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:56:14.148603 env[1458]: time="2024-02-13T07:56:14.148577647Z" level=error msg="encountered an error cleaning up failed sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:56:14.148664 env[1458]: time="2024-02-13T07:56:14.148626643Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-5d78c9869d-qrnjl,Uid:9ec457c2-fc28-4626-897d-f9a56a1fa755,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:56:14.148862 kubelet[2569]: E0213 07:56:14.148818 2569 remote_runtime.go:176] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:56:14.148862 kubelet[2569]: E0213 07:56:14.148863 2569 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-5d78c9869d-qrnjl" Feb 13 07:56:14.148967 kubelet[2569]: E0213 07:56:14.148884 2569 kuberuntime_manager.go:1122] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-5d78c9869d-qrnjl" Feb 13 07:56:14.148967 kubelet[2569]: E0213 07:56:14.148930 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-5d78c9869d-qrnjl_kube-system(9ec457c2-fc28-4626-897d-f9a56a1fa755)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-5d78c9869d-qrnjl_kube-system(9ec457c2-fc28-4626-897d-f9a56a1fa755)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 07:56:14.151111 env[1458]: time="2024-02-13T07:56:14.151057146Z" level=error msg="Failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:56:14.151361 env[1458]: time="2024-02-13T07:56:14.151315102Z" level=error msg="encountered an error cleaning up failed sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:56:14.151409 env[1458]: time="2024-02-13T07:56:14.151354582Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-846b88998b-4vbpv,Uid:59431b79-ecac-4529-9083-2bad55873c23,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:56:14.151527 kubelet[2569]: E0213 07:56:14.151514 2569 remote_runtime.go:176] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:56:14.151571 kubelet[2569]: E0213 07:56:14.151551 2569 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" Feb 13 07:56:14.151609 kubelet[2569]: E0213 07:56:14.151575 2569 kuberuntime_manager.go:1122] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" Feb 13 07:56:14.151655 kubelet[2569]: E0213 07:56:14.151616 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-846b88998b-4vbpv_calico-system(59431b79-ecac-4529-9083-2bad55873c23)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-846b88998b-4vbpv_calico-system(59431b79-ecac-4529-9083-2bad55873c23)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 07:56:14.151725 env[1458]: time="2024-02-13T07:56:14.151679931Z" level=error msg="Failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:56:14.151971 env[1458]: time="2024-02-13T07:56:14.151940875Z" level=error msg="encountered an error cleaning up failed sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:56:14.152009 env[1458]: time="2024-02-13T07:56:14.151986701Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-5d78c9869d-7xbl5,Uid:18082ea3-5d5e-4eed-963b-be8271107e06,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:56:14.152141 kubelet[2569]: E0213 07:56:14.152130 2569 remote_runtime.go:176] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:56:14.152181 kubelet[2569]: E0213 07:56:14.152162 2569 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-5d78c9869d-7xbl5" Feb 13 07:56:14.152218 kubelet[2569]: E0213 07:56:14.152181 2569 kuberuntime_manager.go:1122] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-5d78c9869d-7xbl5" Feb 13 07:56:14.152253 kubelet[2569]: E0213 07:56:14.152219 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-5d78c9869d-7xbl5_kube-system(18082ea3-5d5e-4eed-963b-be8271107e06)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-5d78c9869d-7xbl5_kube-system(18082ea3-5d5e-4eed-963b-be8271107e06)\\\": rpc error: code = 
Unknown desc = failed to setup network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 07:56:14.252955 kubelet[2569]: I0213 07:56:14.252936 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 07:56:14.253287 env[1458]: time="2024-02-13T07:56:14.253263192Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 07:56:14.253531 kubelet[2569]: I0213 07:56:14.253490 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 07:56:14.253852 env[1458]: time="2024-02-13T07:56:14.253827216Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 07:56:14.254172 kubelet[2569]: I0213 07:56:14.254155 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 07:56:14.254471 env[1458]: time="2024-02-13T07:56:14.254452266Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 07:56:14.271059 env[1458]: time="2024-02-13T07:56:14.271012816Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:56:14.271269 kubelet[2569]: E0213 07:56:14.271246 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 07:56:14.271323 kubelet[2569]: E0213 07:56:14.271289 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 07:56:14.271323 kubelet[2569]: E0213 07:56:14.271318 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 07:56:14.271412 kubelet[2569]: E0213 07:56:14.271340 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for 
\"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 07:56:14.272327 env[1458]: time="2024-02-13T07:56:14.272294048Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:56:14.272458 kubelet[2569]: E0213 07:56:14.272445 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 07:56:14.272524 kubelet[2569]: E0213 07:56:14.272472 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 07:56:14.272572 kubelet[2569]: E0213 07:56:14.272526 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 07:56:14.272572 kubelet[2569]: E0213 07:56:14.272557 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 07:56:14.273622 env[1458]: time="2024-02-13T07:56:14.273569986Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:56:14.273743 kubelet[2569]: E0213 07:56:14.273695 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc 
error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 07:56:14.273743 kubelet[2569]: E0213 07:56:14.273718 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 07:56:14.273743 kubelet[2569]: E0213 07:56:14.273745 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 07:56:14.273881 kubelet[2569]: E0213 07:56:14.273766 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 07:56:14.969324 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b-shm.mount: Deactivated successfully. Feb 13 07:56:14.969402 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100-shm.mount: Deactivated successfully. 
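Every RunPodSandbox and StopPodSandbox failure above bottoms out in the same Calico CNI check: the plugin stat()s /var/lib/calico/nodename, a file the calico/node container writes only once it is running with /var/lib/calico mounted from the host (the log's own hint: "check that the calico/node container is running and has mounted /var/lib/calico/"). A minimal sketch of that readiness check, assuming nothing beyond the path named in the errors:

from pathlib import Path

# The file the Calico CNI plugin stat()s before wiring up any pod network;
# per the errors above it does not exist until calico/node has started.
NODENAME = Path("/var/lib/calico/nodename")

def calico_node_ready() -> bool:
    try:
        return bool(NODENAME.read_text().strip())
    except FileNotFoundError:
        # Matches the log's "stat /var/lib/calico/nodename: no such file or
        # directory": calico/node is not running or lacks the host mount.
        return False

if __name__ == "__main__":
    print("calico/node ready:", calico_node_ready())

Until that file appears, kubelet keeps retrying and re-emitting the CreatePodSandboxError and KillPodSandboxError entries seen here for coredns-5d78c9869d-qrnjl, coredns-5d78c9869d-7xbl5, calico-kube-controllers-846b88998b-4vbpv, and csi-node-driver-8djc9.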
Feb 13 07:56:24.706757 env[1458]: time="2024-02-13T07:56:24.706604178Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 07:56:24.733222 env[1458]: time="2024-02-13T07:56:24.733157404Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:56:24.733371 kubelet[2569]: E0213 07:56:24.733332 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 07:56:24.733371 kubelet[2569]: E0213 07:56:24.733357 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 07:56:24.733541 kubelet[2569]: E0213 07:56:24.733379 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 07:56:24.733541 kubelet[2569]: E0213 07:56:24.733396 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 07:56:25.706464 env[1458]: time="2024-02-13T07:56:25.706369746Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 07:56:25.722517 env[1458]: time="2024-02-13T07:56:25.722482149Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:56:25.722746 kubelet[2569]: E0213 07:56:25.722683 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": 
plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 07:56:25.722746 kubelet[2569]: E0213 07:56:25.722712 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 07:56:25.722746 kubelet[2569]: E0213 07:56:25.722735 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 07:56:25.722850 kubelet[2569]: E0213 07:56:25.722753 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 07:56:28.706155 env[1458]: time="2024-02-13T07:56:28.706024559Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 07:56:28.732839 env[1458]: time="2024-02-13T07:56:28.732761594Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:56:28.733052 kubelet[2569]: E0213 07:56:28.733005 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 07:56:28.733052 kubelet[2569]: E0213 07:56:28.733029 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 07:56:28.733052 kubelet[2569]: E0213 07:56:28.733051 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 07:56:28.733257 kubelet[2569]: E0213 07:56:28.733067 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 07:56:29.705123 env[1458]: time="2024-02-13T07:56:29.705072660Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 07:56:29.718731 env[1458]: time="2024-02-13T07:56:29.718659018Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:56:29.718967 kubelet[2569]: E0213 07:56:29.718857 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 07:56:29.718967 kubelet[2569]: E0213 07:56:29.718886 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 07:56:29.718967 kubelet[2569]: E0213 07:56:29.718913 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 07:56:29.718967 kubelet[2569]: E0213 07:56:29.718934 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 07:56:35.706605 env[1458]: time="2024-02-13T07:56:35.706476997Z" level=info msg="StopPodSandbox for 
\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 07:56:35.719951 env[1458]: time="2024-02-13T07:56:35.719910300Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:56:35.720148 kubelet[2569]: E0213 07:56:35.720106 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 07:56:35.720148 kubelet[2569]: E0213 07:56:35.720138 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 07:56:35.720338 kubelet[2569]: E0213 07:56:35.720162 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 07:56:35.720338 kubelet[2569]: E0213 07:56:35.720183 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 07:56:36.706320 env[1458]: time="2024-02-13T07:56:36.706190041Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 07:56:36.734938 env[1458]: time="2024-02-13T07:56:36.734902134Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:56:36.735185 kubelet[2569]: E0213 07:56:36.735122 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that 
the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 07:56:36.735185 kubelet[2569]: E0213 07:56:36.735150 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 07:56:36.735185 kubelet[2569]: E0213 07:56:36.735172 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 07:56:36.735347 kubelet[2569]: E0213 07:56:36.735190 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 07:56:43.706330 env[1458]: time="2024-02-13T07:56:43.706238057Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 07:56:43.758225 env[1458]: time="2024-02-13T07:56:43.758131008Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:56:43.758508 kubelet[2569]: E0213 07:56:43.758461 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 07:56:43.758917 kubelet[2569]: E0213 07:56:43.758512 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 07:56:43.758917 kubelet[2569]: E0213 07:56:43.758570 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/\"" Feb 13 07:56:43.758917 kubelet[2569]: E0213 07:56:43.758617 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 07:56:44.706101 env[1458]: time="2024-02-13T07:56:44.705997662Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 07:56:44.735669 env[1458]: time="2024-02-13T07:56:44.735603646Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:56:44.736010 kubelet[2569]: E0213 07:56:44.735924 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 07:56:44.736010 kubelet[2569]: E0213 07:56:44.735951 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 07:56:44.736010 kubelet[2569]: E0213 07:56:44.735973 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 07:56:44.736010 kubelet[2569]: E0213 07:56:44.735991 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 07:56:46.109000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file 
permissive=0 Feb 13 07:56:46.137454 kernel: kauditd_printk_skb: 34 callbacks suppressed Feb 13 07:56:46.137527 kernel: audit: type=1400 audit(1707811006.109:1173): avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:56:46.109000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c000cbc540 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 07:56:46.350495 kernel: audit: type=1300 audit(1707811006.109:1173): arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c000cbc540 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 07:56:46.350534 kernel: audit: type=1327 audit(1707811006.109:1173): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 07:56:46.109000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 07:56:46.109000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:56:46.536544 kernel: audit: type=1400 audit(1707811006.109:1174): avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:56:46.536597 kernel: audit: type=1300 audit(1707811006.109:1174): arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c0015a47c0 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 07:56:46.109000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c0015a47c0 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 07:56:46.656943 kernel: audit: type=1327 audit(1707811006.109:1174): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 
13 07:56:46.109000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 07:56:46.750172 kernel: audit: type=1400 audit(1707811006.182:1175): avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:56:46.182000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:56:46.840328 kernel: audit: type=1300 audit(1707811006.182:1175): arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c00af56220 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 07:56:46.182000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c00af56220 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 07:56:46.939589 kernel: audit: type=1327 audit(1707811006.182:1175): proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 07:56:46.182000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 07:56:47.033669 kernel: audit: type=1400 audit(1707811006.182:1176): avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:56:46.182000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:56:46.182000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c0095a8120 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 07:56:46.182000 audit: PROCTITLE 
proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 07:56:46.183000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/apiserver.crt" dev="sdb9" ino=524817 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:56:46.183000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c013fd0f60 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 07:56:46.183000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 07:56:46.924000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:56:46.924000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c0095a8450 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 07:56:46.924000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 07:56:46.924000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-client.crt" dev="sdb9" ino=524823 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:56:46.924000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c009107aa0 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 07:56:46.924000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 07:56:46.924000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:56:46.924000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5c a1=c0094fc720 a2=fc6 a3=0 items=0 ppid=2256 
pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 07:56:46.924000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 07:56:48.706710 env[1458]: time="2024-02-13T07:56:48.706604592Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 07:56:48.733292 env[1458]: time="2024-02-13T07:56:48.733198147Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:56:48.733470 kubelet[2569]: E0213 07:56:48.733460 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 07:56:48.733623 kubelet[2569]: E0213 07:56:48.733485 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 07:56:48.733623 kubelet[2569]: E0213 07:56:48.733507 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 07:56:48.733623 kubelet[2569]: E0213 07:56:48.733524 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 07:56:49.705922 env[1458]: time="2024-02-13T07:56:49.705865320Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 07:56:49.718323 env[1458]: time="2024-02-13T07:56:49.718245964Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to 
destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:56:49.718553 kubelet[2569]: E0213 07:56:49.718386 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 07:56:49.718553 kubelet[2569]: E0213 07:56:49.718412 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 07:56:49.718553 kubelet[2569]: E0213 07:56:49.718438 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 07:56:49.718553 kubelet[2569]: E0213 07:56:49.718457 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 07:56:50.859000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:56:50.859000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c000b23d80 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 07:56:50.859000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 07:56:50.863000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:56:50.863000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 
a0=a a1=c00114f720 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 07:56:50.863000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 07:56:50.866000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:56:50.866000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c000db51e0 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 07:56:50.866000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 07:56:50.869000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:56:50.869000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c001abb9a0 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 07:56:50.869000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 07:56:55.706750 env[1458]: time="2024-02-13T07:56:55.706613267Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 07:56:55.734407 env[1458]: time="2024-02-13T07:56:55.734344797Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:56:55.734547 kubelet[2569]: E0213 07:56:55.734536 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 07:56:55.734735 kubelet[2569]: E0213 07:56:55.734561 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 07:56:55.734735 kubelet[2569]: E0213 07:56:55.734585 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 07:56:55.734735 kubelet[2569]: E0213 07:56:55.734602 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 07:56:58.706151 env[1458]: time="2024-02-13T07:56:58.706004397Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 07:56:58.757405 env[1458]: time="2024-02-13T07:56:58.757340817Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:56:58.757643 kubelet[2569]: E0213 07:56:58.757614 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 07:56:58.757951 kubelet[2569]: E0213 07:56:58.757668 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 07:56:58.757951 kubelet[2569]: E0213 07:56:58.757716 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 
07:56:58.757951 kubelet[2569]: E0213 07:56:58.757749 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 07:57:02.706404 env[1458]: time="2024-02-13T07:57:02.706276545Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 07:57:02.706404 env[1458]: time="2024-02-13T07:57:02.706314729Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 07:57:02.733376 env[1458]: time="2024-02-13T07:57:02.733340775Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:57:02.733495 env[1458]: time="2024-02-13T07:57:02.733350438Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:57:02.733536 kubelet[2569]: E0213 07:57:02.733518 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 07:57:02.733728 kubelet[2569]: E0213 07:57:02.733544 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 07:57:02.733728 kubelet[2569]: E0213 07:57:02.733567 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 07:57:02.733728 kubelet[2569]: E0213 07:57:02.733585 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for 
sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 07:57:02.733728 kubelet[2569]: E0213 07:57:02.733518 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 07:57:02.733728 kubelet[2569]: E0213 07:57:02.733607 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 07:57:02.733873 kubelet[2569]: E0213 07:57:02.733645 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 07:57:02.733873 kubelet[2569]: E0213 07:57:02.733670 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 07:57:10.706005 env[1458]: time="2024-02-13T07:57:10.705903144Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 07:57:10.756475 env[1458]: time="2024-02-13T07:57:10.756332919Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:57:10.756839 kubelet[2569]: E0213 07:57:10.756800 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 07:57:10.757489 
kubelet[2569]: E0213 07:57:10.756878 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 07:57:10.757489 kubelet[2569]: E0213 07:57:10.756960 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 07:57:10.757489 kubelet[2569]: E0213 07:57:10.757031 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 07:57:13.705392 env[1458]: time="2024-02-13T07:57:13.705362865Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 07:57:13.705392 env[1458]: time="2024-02-13T07:57:13.705368965Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 07:57:13.705736 env[1458]: time="2024-02-13T07:57:13.705362854Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 07:57:13.722711 env[1458]: time="2024-02-13T07:57:13.722660740Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:57:13.722933 kubelet[2569]: E0213 07:57:13.722920 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 07:57:13.723126 kubelet[2569]: E0213 07:57:13.722958 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 07:57:13.723126 kubelet[2569]: E0213 07:57:13.722990 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin 
type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 07:57:13.723126 kubelet[2569]: E0213 07:57:13.723011 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 07:57:13.724282 env[1458]: time="2024-02-13T07:57:13.724259656Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:57:13.724395 kubelet[2569]: E0213 07:57:13.724385 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 07:57:13.724441 kubelet[2569]: E0213 07:57:13.724404 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 07:57:13.724441 kubelet[2569]: E0213 07:57:13.724425 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 07:57:13.724520 kubelet[2569]: E0213 07:57:13.724442 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 07:57:13.724581 env[1458]: time="2024-02-13T07:57:13.724562447Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox 
\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:57:13.724661 kubelet[2569]: E0213 07:57:13.724624 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 07:57:13.724661 kubelet[2569]: E0213 07:57:13.724645 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 07:57:13.724709 kubelet[2569]: E0213 07:57:13.724664 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 07:57:13.724709 kubelet[2569]: E0213 07:57:13.724679 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 07:57:21.706922 env[1458]: time="2024-02-13T07:57:21.706820113Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 07:57:21.733772 env[1458]: time="2024-02-13T07:57:21.733679124Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:57:21.733899 kubelet[2569]: E0213 07:57:21.733883 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 07:57:21.734057 kubelet[2569]: E0213 07:57:21.733911 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" 
podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 07:57:21.734057 kubelet[2569]: E0213 07:57:21.733933 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 07:57:21.734057 kubelet[2569]: E0213 07:57:21.733950 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 07:57:24.706909 env[1458]: time="2024-02-13T07:57:24.706812125Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 07:57:24.733456 env[1458]: time="2024-02-13T07:57:24.733421564Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:57:24.733612 kubelet[2569]: E0213 07:57:24.733601 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 07:57:24.733806 kubelet[2569]: E0213 07:57:24.733628 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 07:57:24.733806 kubelet[2569]: E0213 07:57:24.733692 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 07:57:24.733806 kubelet[2569]: E0213 07:57:24.733724 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy 
network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 07:57:25.706205 env[1458]: time="2024-02-13T07:57:25.706080177Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 07:57:25.720168 env[1458]: time="2024-02-13T07:57:25.720106090Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:57:25.720410 kubelet[2569]: E0213 07:57:25.720280 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 07:57:25.720410 kubelet[2569]: E0213 07:57:25.720310 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 07:57:25.720410 kubelet[2569]: E0213 07:57:25.720336 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 07:57:25.720410 kubelet[2569]: E0213 07:57:25.720360 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 07:57:27.706902 env[1458]: time="2024-02-13T07:57:27.706798713Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 07:57:27.733442 env[1458]: time="2024-02-13T07:57:27.733394021Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:57:27.733570 kubelet[2569]: E0213 07:57:27.733560 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 07:57:27.733777 kubelet[2569]: E0213 07:57:27.733585 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 07:57:27.733777 kubelet[2569]: E0213 07:57:27.733609 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 07:57:27.733777 kubelet[2569]: E0213 07:57:27.733627 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 07:57:34.706953 env[1458]: time="2024-02-13T07:57:34.706859667Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 07:57:34.733535 env[1458]: time="2024-02-13T07:57:34.733469979Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:57:34.733700 kubelet[2569]: E0213 07:57:34.733652 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 07:57:34.733700 kubelet[2569]: E0213 07:57:34.733679 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 07:57:34.733700 kubelet[2569]: E0213 07:57:34.733700 2569 
kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 07:57:34.733922 kubelet[2569]: E0213 07:57:34.733717 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 07:57:35.706442 env[1458]: time="2024-02-13T07:57:35.706313943Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 07:57:35.733178 env[1458]: time="2024-02-13T07:57:35.733117869Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:57:35.733433 kubelet[2569]: E0213 07:57:35.733312 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 07:57:35.733433 kubelet[2569]: E0213 07:57:35.733351 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 07:57:35.733433 kubelet[2569]: E0213 07:57:35.733371 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 07:57:35.733433 kubelet[2569]: E0213 07:57:35.733390 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 07:57:38.706793 env[1458]: time="2024-02-13T07:57:38.706624213Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 07:57:38.760504 env[1458]: time="2024-02-13T07:57:38.760444314Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:57:38.760782 kubelet[2569]: E0213 07:57:38.760731 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 07:57:38.760782 kubelet[2569]: E0213 07:57:38.760775 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 07:57:38.761141 kubelet[2569]: E0213 07:57:38.760823 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 07:57:38.761141 kubelet[2569]: E0213 07:57:38.760858 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 07:57:42.706945 env[1458]: time="2024-02-13T07:57:42.706857635Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 07:57:42.761134 env[1458]: time="2024-02-13T07:57:42.760996427Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:57:42.761550 kubelet[2569]: E0213 
07:57:42.761507 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 07:57:42.762266 kubelet[2569]: E0213 07:57:42.761594 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 07:57:42.762266 kubelet[2569]: E0213 07:57:42.761737 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 07:57:42.762266 kubelet[2569]: E0213 07:57:42.761852 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 07:57:46.110000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:57:46.138563 kernel: kauditd_printk_skb: 26 callbacks suppressed Feb 13 07:57:46.138676 kernel: audit: type=1400 audit(1707811066.110:1185): avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:57:46.110000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c0028a6900 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 07:57:46.355017 kernel: audit: type=1300 audit(1707811066.110:1185): arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c0028a6900 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 07:57:46.355090 kernel: audit: type=1327 audit(1707811066.110:1185): 
proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 07:57:46.110000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 07:57:46.448156 kernel: audit: type=1400 audit(1707811066.110:1186): avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:57:46.110000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:57:46.539968 kernel: audit: type=1300 audit(1707811066.110:1186): arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c001a3c4c0 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 07:57:46.110000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c001a3c4c0 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 07:57:46.660397 kernel: audit: type=1327 audit(1707811066.110:1186): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 07:57:46.110000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 07:57:46.753619 kernel: audit: type=1400 audit(1707811066.182:1187): avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:57:46.182000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:57:46.843607 kernel: audit: type=1300 audit(1707811066.182:1187): arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c00e69f440 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 
key=(null) Feb 13 07:57:46.182000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c00e69f440 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 07:57:46.942016 kernel: audit: type=1327 audit(1707811066.182:1187): proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 07:57:46.182000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 07:57:47.035317 kernel: audit: type=1400 audit(1707811066.182:1188): avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:57:46.182000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:57:46.182000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5c a1=c00abeba70 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 07:57:46.182000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 07:57:46.183000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/apiserver.crt" dev="sdb9" ino=524817 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:57:46.183000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c00a0f6510 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 07:57:46.183000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 07:57:46.925000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:57:46.925000 
audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c00e68ba80 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 07:57:46.925000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 07:57:46.925000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-client.crt" dev="sdb9" ino=524823 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:57:46.925000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:57:46.925000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5c a1=c009914390 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 07:57:46.925000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 07:57:46.925000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c0065541e0 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 07:57:46.925000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 07:57:48.707037 env[1458]: time="2024-02-13T07:57:48.706941669Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 07:57:48.733212 env[1458]: time="2024-02-13T07:57:48.733176350Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:57:48.733381 kubelet[2569]: E0213 07:57:48.733368 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 07:57:48.733560 kubelet[2569]: E0213 07:57:48.733398 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 07:57:48.733560 kubelet[2569]: E0213 07:57:48.733431 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 07:57:48.733560 kubelet[2569]: E0213 07:57:48.733460 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 07:57:49.705509 env[1458]: time="2024-02-13T07:57:49.705458849Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 07:57:49.718231 env[1458]: time="2024-02-13T07:57:49.718174477Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:57:49.718447 kubelet[2569]: E0213 07:57:49.718317 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 07:57:49.718447 kubelet[2569]: E0213 07:57:49.718341 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 07:57:49.718447 kubelet[2569]: E0213 07:57:49.718365 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 07:57:49.718447 kubelet[2569]: E0213 07:57:49.718384 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 07:57:50.860000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:57:50.860000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c0015a4ae0 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 07:57:50.860000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 07:57:50.864000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:57:50.864000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c000ffe800 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 07:57:50.864000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 07:57:50.866000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:57:50.866000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c0024f0060 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 07:57:50.866000 audit: PROCTITLE 
proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 07:57:50.870000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:57:50.870000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c000ffe960 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 07:57:50.870000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 07:57:53.706691 env[1458]: time="2024-02-13T07:57:53.706529484Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 07:57:53.761075 env[1458]: time="2024-02-13T07:57:53.760973357Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:57:53.761338 kubelet[2569]: E0213 07:57:53.761284 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 07:57:53.761338 kubelet[2569]: E0213 07:57:53.761337 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 07:57:53.761819 kubelet[2569]: E0213 07:57:53.761395 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 07:57:53.761819 kubelet[2569]: E0213 07:57:53.761439 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox 
\\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 07:57:56.705735 env[1458]: time="2024-02-13T07:57:56.705700062Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 07:57:56.726903 env[1458]: time="2024-02-13T07:57:56.726818090Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:57:56.727091 kubelet[2569]: E0213 07:57:56.727072 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 07:57:56.727390 kubelet[2569]: E0213 07:57:56.727113 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 07:57:56.727390 kubelet[2569]: E0213 07:57:56.727156 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 07:57:56.727390 kubelet[2569]: E0213 07:57:56.727189 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 07:57:59.707706 env[1458]: time="2024-02-13T07:57:59.707592157Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 07:57:59.723446 env[1458]: time="2024-02-13T07:57:59.723411566Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:57:59.723602 kubelet[2569]: E0213 07:57:59.723592 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 07:57:59.723776 kubelet[2569]: E0213 07:57:59.723619 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 07:57:59.723776 kubelet[2569]: E0213 07:57:59.723654 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 07:57:59.723776 kubelet[2569]: E0213 07:57:59.723673 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 07:58:03.706996 env[1458]: time="2024-02-13T07:58:03.706859358Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 07:58:03.722908 env[1458]: time="2024-02-13T07:58:03.722845819Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:58:03.723067 kubelet[2569]: E0213 07:58:03.723019 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 07:58:03.723067 kubelet[2569]: E0213 07:58:03.723046 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 07:58:03.723250 kubelet[2569]: E0213 07:58:03.723070 2569 
kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 07:58:03.723250 kubelet[2569]: E0213 07:58:03.723089 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 07:58:07.706844 env[1458]: time="2024-02-13T07:58:07.706754318Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 07:58:07.733287 env[1458]: time="2024-02-13T07:58:07.733224878Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:58:07.733466 kubelet[2569]: E0213 07:58:07.733423 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 07:58:07.733466 kubelet[2569]: E0213 07:58:07.733449 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 07:58:07.733643 kubelet[2569]: E0213 07:58:07.733473 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 07:58:07.733643 kubelet[2569]: E0213 07:58:07.733491 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 07:58:11.706613 env[1458]: time="2024-02-13T07:58:11.706516491Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 07:58:11.707394 env[1458]: time="2024-02-13T07:58:11.706682318Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 07:58:11.752540 env[1458]: time="2024-02-13T07:58:11.752473957Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:58:11.752807 kubelet[2569]: E0213 07:58:11.752784 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 07:58:11.753153 kubelet[2569]: E0213 07:58:11.752834 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 07:58:11.753153 kubelet[2569]: E0213 07:58:11.752882 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 07:58:11.753153 kubelet[2569]: E0213 07:58:11.752918 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 07:58:11.754219 env[1458]: time="2024-02-13T07:58:11.754151237Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:58:11.754334 kubelet[2569]: 
E0213 07:58:11.754313 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 07:58:11.754405 kubelet[2569]: E0213 07:58:11.754339 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 07:58:11.754405 kubelet[2569]: E0213 07:58:11.754379 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 07:58:11.754520 kubelet[2569]: E0213 07:58:11.754409 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 07:58:18.706715 env[1458]: time="2024-02-13T07:58:18.706496465Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 07:58:18.736238 env[1458]: time="2024-02-13T07:58:18.736180268Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:58:18.736420 kubelet[2569]: E0213 07:58:18.736386 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 07:58:18.736420 kubelet[2569]: E0213 07:58:18.736413 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 07:58:18.736595 kubelet[2569]: E0213 07:58:18.736435 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with 
KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 07:58:18.736595 kubelet[2569]: E0213 07:58:18.736452 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 07:58:19.706614 env[1458]: time="2024-02-13T07:58:19.706519936Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 07:58:19.720611 env[1458]: time="2024-02-13T07:58:19.720577925Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:58:19.720849 kubelet[2569]: E0213 07:58:19.720788 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 07:58:19.720849 kubelet[2569]: E0213 07:58:19.720819 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 07:58:19.720849 kubelet[2569]: E0213 07:58:19.720846 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 07:58:19.720960 kubelet[2569]: E0213 07:58:19.720866 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 07:58:23.706629 env[1458]: time="2024-02-13T07:58:23.706485647Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 07:58:23.759208 env[1458]: time="2024-02-13T07:58:23.759100292Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:58:23.759432 kubelet[2569]: E0213 07:58:23.759391 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 07:58:23.759432 kubelet[2569]: E0213 07:58:23.759433 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 07:58:23.759803 kubelet[2569]: E0213 07:58:23.759476 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 07:58:23.759803 kubelet[2569]: E0213 07:58:23.759514 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 07:58:24.706827 env[1458]: time="2024-02-13T07:58:24.706721090Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 07:58:24.760798 env[1458]: time="2024-02-13T07:58:24.760699646Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:58:24.761058 kubelet[2569]: E0213 07:58:24.760999 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy 
network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 07:58:24.761058 kubelet[2569]: E0213 07:58:24.761052 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 07:58:24.761510 kubelet[2569]: E0213 07:58:24.761105 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 07:58:24.761510 kubelet[2569]: E0213 07:58:24.761146 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 07:58:30.706779 env[1458]: time="2024-02-13T07:58:30.706620490Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 07:58:30.759432 env[1458]: time="2024-02-13T07:58:30.759339303Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:58:30.759732 kubelet[2569]: E0213 07:58:30.759672 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 07:58:30.759732 kubelet[2569]: E0213 07:58:30.759727 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 07:58:30.760188 kubelet[2569]: E0213 07:58:30.759786 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox 
\\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 07:58:30.760188 kubelet[2569]: E0213 07:58:30.759829 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 07:58:34.706125 env[1458]: time="2024-02-13T07:58:34.705978802Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 07:58:34.732959 env[1458]: time="2024-02-13T07:58:34.732898484Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:58:34.733172 kubelet[2569]: E0213 07:58:34.733129 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 07:58:34.733172 kubelet[2569]: E0213 07:58:34.733156 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 07:58:34.733343 kubelet[2569]: E0213 07:58:34.733180 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 07:58:34.733343 kubelet[2569]: E0213 07:58:34.733198 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 07:58:36.706425 env[1458]: 
time="2024-02-13T07:58:36.706298063Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 07:58:36.757863 env[1458]: time="2024-02-13T07:58:36.757777937Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:58:36.758078 kubelet[2569]: E0213 07:58:36.758028 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 07:58:36.758078 kubelet[2569]: E0213 07:58:36.758069 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 07:58:36.758450 kubelet[2569]: E0213 07:58:36.758114 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 07:58:36.758450 kubelet[2569]: E0213 07:58:36.758148 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 07:58:37.707034 env[1458]: time="2024-02-13T07:58:37.706912921Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 07:58:37.723724 env[1458]: time="2024-02-13T07:58:37.723659625Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:58:37.723900 kubelet[2569]: E0213 07:58:37.723859 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed 
(delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 07:58:37.723900 kubelet[2569]: E0213 07:58:37.723886 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 07:58:37.723971 kubelet[2569]: E0213 07:58:37.723910 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 07:58:37.723971 kubelet[2569]: E0213 07:58:37.723931 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 07:58:42.706343 env[1458]: time="2024-02-13T07:58:42.706245864Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 07:58:42.753205 env[1458]: time="2024-02-13T07:58:42.753104146Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:58:42.753455 kubelet[2569]: E0213 07:58:42.753428 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 07:58:42.753900 kubelet[2569]: E0213 07:58:42.753490 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 07:58:42.753900 kubelet[2569]: E0213 07:58:42.753574 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 07:58:42.753900 kubelet[2569]: E0213 07:58:42.753650 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 07:58:46.110000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:58:46.138995 kernel: kauditd_printk_skb: 26 callbacks suppressed Feb 13 07:58:46.139078 kernel: audit: type=1400 audit(1707811126.110:1197): avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:58:46.110000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c002a864e0 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 07:58:46.351645 kernel: audit: type=1300 audit(1707811126.110:1197): arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c002a864e0 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 07:58:46.351739 kernel: audit: type=1327 audit(1707811126.110:1197): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 07:58:46.110000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 07:58:46.110000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:58:46.533660 kernel: audit: type=1400 audit(1707811126.110:1198): avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:58:46.533711 kernel: audit: type=1300 audit(1707811126.110:1198): arch=c000003e syscall=254 success=no exit=-13 a0=a 
a1=c0007aa400 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 07:58:46.110000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c0007aa400 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 07:58:46.654282 kernel: audit: type=1327 audit(1707811126.110:1198): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 07:58:46.110000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 07:58:46.749468 kernel: audit: type=1400 audit(1707811126.182:1199): avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:58:46.182000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:58:46.182000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c004542140 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 07:58:46.937827 kernel: audit: type=1300 audit(1707811126.182:1199): arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c004542140 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 07:58:46.937867 kernel: audit: type=1327 audit(1707811126.182:1199): proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 07:58:46.182000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 07:58:47.030990 kernel: audit: type=1400 audit(1707811126.182:1200): avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 
scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:58:46.182000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:58:46.182000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c009651410 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 07:58:46.182000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 07:58:46.183000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/apiserver.crt" dev="sdb9" ino=524817 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:58:46.183000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c009651470 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 07:58:46.183000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 07:58:46.925000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:58:46.925000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c009651c20 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 07:58:46.925000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 07:58:46.925000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-client.crt" dev="sdb9" ino=524823 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:58:46.925000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 
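The audit records above arrive in pairs because kauditd echoes each event back through the kernel ring buffer (the "kernel: audit: type=1400/1300/1327" lines repeat the preceding audit[pid] records). The denial itself is SELinux refusing the confined control-plane containers an inotify watch on the host PKI files: on x86_64, syscall=254 is inotify_add_watch and exit=-13 is -EACCES, matching the denied { watch } permission. The PROCTITLE field is the watching process's argv, hex-encoded with NUL separators and truncated by the audit subsystem; a minimal Python sketch that decodes the kube-controller-manager value logged above:

    # Decode an audit PROCTITLE record: hex-encoded argv, NUL-separated.
    # Hex string copied verbatim from the record above; audit truncates
    # long command lines, hence the cut-off final flag.
    hexstr = (
        "6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F"
        "636174652D6E6F64652D63696472733D74727565002D2D61757468656E74"
        "69636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E"
        "657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D"
        "2D617574686F7269"
    )
    argv = bytes.fromhex(hexstr).split(b"\x00")
    print(" ".join(a.decode() for a in argv))
    # kube-controller-manager --allocate-node-cidrs=true
    #   --authentication-kubeconfig=/etc/kubernetes/controller-manager.conf --authori

The kube-apiserver records decode the same way (to kube-apiserver --advertise-address=145.40.90.207 --allow-privileged=true --authorization-mode=Node,RBAC ..., matching the host address seen in the sshd unit names below).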
07:58:46.925000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5c a1=c005977340 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 07:58:46.925000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c00a963470 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 07:58:46.925000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 07:58:46.925000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 07:58:47.706050 env[1458]: time="2024-02-13T07:58:47.705923601Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 07:58:47.735019 env[1458]: time="2024-02-13T07:58:47.734958759Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:58:47.735190 kubelet[2569]: E0213 07:58:47.735151 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 07:58:47.735190 kubelet[2569]: E0213 07:58:47.735177 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 07:58:47.735370 kubelet[2569]: E0213 07:58:47.735198 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 07:58:47.735370 kubelet[2569]: E0213 07:58:47.735216 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: 
code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 07:58:48.706452 env[1458]: time="2024-02-13T07:58:48.706316412Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 07:58:48.723170 env[1458]: time="2024-02-13T07:58:48.723095666Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:58:48.723315 kubelet[2569]: E0213 07:58:48.723304 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 07:58:48.723359 kubelet[2569]: E0213 07:58:48.723329 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 07:58:48.723359 kubelet[2569]: E0213 07:58:48.723351 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 07:58:48.723428 kubelet[2569]: E0213 07:58:48.723368 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 07:58:49.705701 env[1458]: time="2024-02-13T07:58:49.705675085Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 07:58:49.717947 env[1458]: time="2024-02-13T07:58:49.717914654Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:58:49.718162 kubelet[2569]: E0213 07:58:49.718041 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 07:58:49.718162 kubelet[2569]: E0213 07:58:49.718062 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 07:58:49.718162 kubelet[2569]: E0213 07:58:49.718085 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 07:58:49.718162 kubelet[2569]: E0213 07:58:49.718103 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 07:58:50.860000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:58:50.860000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c001bb7300 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 07:58:50.860000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 07:58:50.865000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:58:50.865000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c001a3dce0 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 
fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 07:58:50.865000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 07:58:50.866000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:58:50.866000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c001a3dd20 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 07:58:50.866000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 07:58:50.870000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:58:50.870000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c001b230e0 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 07:58:50.870000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 07:58:57.707291 env[1458]: time="2024-02-13T07:58:57.707188760Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 07:58:57.734756 env[1458]: time="2024-02-13T07:58:57.734694993Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:58:57.734912 kubelet[2569]: E0213 07:58:57.734882 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 07:58:57.735056 kubelet[2569]: E0213 07:58:57.734916 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 07:58:57.735056 kubelet[2569]: E0213 07:58:57.734938 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 07:58:57.735056 kubelet[2569]: E0213 07:58:57.734954 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 07:59:02.706885 env[1458]: time="2024-02-13T07:59:02.706786974Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 07:59:02.706885 env[1458]: time="2024-02-13T07:59:02.706811684Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 07:59:02.734112 env[1458]: time="2024-02-13T07:59:02.734048333Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:59:02.734238 kubelet[2569]: E0213 07:59:02.734227 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 07:59:02.734409 kubelet[2569]: E0213 07:59:02.734257 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 07:59:02.734409 kubelet[2569]: E0213 07:59:02.734279 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such 
file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 07:59:02.734409 kubelet[2569]: E0213 07:59:02.734297 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 07:59:02.734409 kubelet[2569]: E0213 07:59:02.734338 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 07:59:02.734409 kubelet[2569]: E0213 07:59:02.734348 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 07:59:02.734554 env[1458]: time="2024-02-13T07:59:02.734245341Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:59:02.734577 kubelet[2569]: E0213 07:59:02.734365 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 07:59:02.734577 kubelet[2569]: E0213 07:59:02.734378 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 07:59:03.054764 systemd[1]: Started sshd@7-145.40.90.207:22-218.92.0.26:26764.service. Feb 13 07:59:03.053000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-145.40.90.207:22-218.92.0.26:26764 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 07:59:03.082415 kernel: kauditd_printk_skb: 26 callbacks suppressed Feb 13 07:59:03.082469 kernel: audit: type=1130 audit(1707811143.053:1209): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-145.40.90.207:22-218.92.0.26:26764 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:59:03.266076 sshd[5248]: Unable to negotiate with 218.92.0.26 port 26764: no matching key exchange method found. Their offer: diffie-hellman-group1-sha1,diffie-hellman-group14-sha1,diffie-hellman-group-exchange-sha1 [preauth] Feb 13 07:59:03.268039 systemd[1]: sshd@7-145.40.90.207:22-218.92.0.26:26764.service: Deactivated successfully. Feb 13 07:59:03.267000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-145.40.90.207:22-218.92.0.26:26764 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:59:03.358802 kernel: audit: type=1131 audit(1707811143.267:1210): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-145.40.90.207:22-218.92.0.26:26764 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 07:59:04.706124 env[1458]: time="2024-02-13T07:59:04.705985447Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 07:59:04.732683 env[1458]: time="2024-02-13T07:59:04.732610573Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:59:04.732853 kubelet[2569]: E0213 07:59:04.732811 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 07:59:04.732853 kubelet[2569]: E0213 07:59:04.732838 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 07:59:04.733036 kubelet[2569]: E0213 07:59:04.732860 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 07:59:04.733036 kubelet[2569]: E0213 07:59:04.732878 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for 
sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 07:59:12.706972 env[1458]: time="2024-02-13T07:59:12.706849804Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 07:59:12.761109 env[1458]: time="2024-02-13T07:59:12.760960092Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:59:12.761488 kubelet[2569]: E0213 07:59:12.761444 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 07:59:12.762151 kubelet[2569]: E0213 07:59:12.761526 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 07:59:12.762151 kubelet[2569]: E0213 07:59:12.761618 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 07:59:12.762151 kubelet[2569]: E0213 07:59:12.761719 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 07:59:15.706897 env[1458]: time="2024-02-13T07:59:15.706812913Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 07:59:15.707726 env[1458]: time="2024-02-13T07:59:15.706859392Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 07:59:15.723130 env[1458]: time="2024-02-13T07:59:15.723005055Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy 
network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:59:15.723130 env[1458]: time="2024-02-13T07:59:15.723005140Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:59:15.723331 kubelet[2569]: E0213 07:59:15.723317 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 07:59:15.723505 kubelet[2569]: E0213 07:59:15.723349 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 07:59:15.723505 kubelet[2569]: E0213 07:59:15.723316 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 07:59:15.723505 kubelet[2569]: E0213 07:59:15.723390 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 07:59:15.723505 kubelet[2569]: E0213 07:59:15.723403 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 07:59:15.723505 kubelet[2569]: E0213 07:59:15.723411 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 07:59:15.723659 kubelet[2569]: E0213 07:59:15.723423 2569 
kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 07:59:15.723659 kubelet[2569]: E0213 07:59:15.723437 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 07:59:17.707045 env[1458]: time="2024-02-13T07:59:17.706901611Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 07:59:17.733792 env[1458]: time="2024-02-13T07:59:17.733741145Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:59:17.734051 kubelet[2569]: E0213 07:59:17.734011 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 07:59:17.734051 kubelet[2569]: E0213 07:59:17.734037 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 07:59:17.734221 kubelet[2569]: E0213 07:59:17.734057 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 07:59:17.734221 kubelet[2569]: E0213 07:59:17.734074 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 07:59:25.706456 env[1458]: time="2024-02-13T07:59:25.706352868Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 07:59:25.760126 env[1458]: time="2024-02-13T07:59:25.760052119Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:59:25.760434 kubelet[2569]: E0213 07:59:25.760375 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 07:59:25.760434 kubelet[2569]: E0213 07:59:25.760428 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 07:59:25.760950 kubelet[2569]: E0213 07:59:25.760486 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 07:59:25.760950 kubelet[2569]: E0213 07:59:25.760527 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 07:59:26.706111 env[1458]: time="2024-02-13T07:59:26.705980709Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 07:59:26.756414 env[1458]: time="2024-02-13T07:59:26.756328182Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:59:26.756799 kubelet[2569]: 
E0213 07:59:26.756579 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 07:59:26.756799 kubelet[2569]: E0213 07:59:26.756621 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 07:59:26.756799 kubelet[2569]: E0213 07:59:26.756675 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 07:59:26.756799 kubelet[2569]: E0213 07:59:26.756713 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 07:59:29.705922 env[1458]: time="2024-02-13T07:59:29.705895335Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 07:59:29.719020 env[1458]: time="2024-02-13T07:59:29.718980380Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:59:29.719222 kubelet[2569]: E0213 07:59:29.719190 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 07:59:29.719222 kubelet[2569]: E0213 07:59:29.719219 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 07:59:29.719433 kubelet[2569]: E0213 07:59:29.719247 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with 
KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 07:59:29.719433 kubelet[2569]: E0213 07:59:29.719269 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 07:59:30.707024 env[1458]: time="2024-02-13T07:59:30.706882868Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 07:59:30.759690 env[1458]: time="2024-02-13T07:59:30.759608048Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:59:30.759939 kubelet[2569]: E0213 07:59:30.759889 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 07:59:30.759939 kubelet[2569]: E0213 07:59:30.759932 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 07:59:30.760304 kubelet[2569]: E0213 07:59:30.759978 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 07:59:30.760304 kubelet[2569]: E0213 07:59:30.760013 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 07:59:40.706512 env[1458]: time="2024-02-13T07:59:40.706387978Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 07:59:40.759236 env[1458]: time="2024-02-13T07:59:40.759129288Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:59:40.759497 kubelet[2569]: E0213 07:59:40.759447 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 07:59:40.759497 kubelet[2569]: E0213 07:59:40.759496 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 07:59:40.759890 kubelet[2569]: E0213 07:59:40.759557 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 07:59:40.759890 kubelet[2569]: E0213 07:59:40.759590 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 07:59:41.706125 env[1458]: time="2024-02-13T07:59:41.706032315Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 07:59:41.758138 env[1458]: time="2024-02-13T07:59:41.758075211Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:59:41.758528 kubelet[2569]: E0213 07:59:41.758323 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy 
network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 07:59:41.758528 kubelet[2569]: E0213 07:59:41.758368 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 07:59:41.758528 kubelet[2569]: E0213 07:59:41.758420 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 07:59:41.758528 kubelet[2569]: E0213 07:59:41.758457 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 07:59:42.706340 env[1458]: time="2024-02-13T07:59:42.706222662Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 07:59:42.754880 env[1458]: time="2024-02-13T07:59:42.754796486Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:59:42.755113 kubelet[2569]: E0213 07:59:42.755064 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 07:59:42.755113 kubelet[2569]: E0213 07:59:42.755107 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 07:59:42.755459 kubelet[2569]: E0213 07:59:42.755151 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox 
\\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 07:59:42.755459 kubelet[2569]: E0213 07:59:42.755184 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 07:59:44.706858 env[1458]: time="2024-02-13T07:59:44.706715433Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 07:59:44.756333 env[1458]: time="2024-02-13T07:59:44.756268634Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:59:44.756618 kubelet[2569]: E0213 07:59:44.756595 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 07:59:44.756923 kubelet[2569]: E0213 07:59:44.756666 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 07:59:44.756923 kubelet[2569]: E0213 07:59:44.756735 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 07:59:44.756923 kubelet[2569]: E0213 07:59:44.756792 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 
07:59:46.111000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:59:46.111000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c000652680 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 07:59:46.326057 kernel: audit: type=1400 audit(1707811186.111:1211): avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:59:46.326096 kernel: audit: type=1300 audit(1707811186.111:1211): arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c000652680 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 07:59:46.326113 kernel: audit: type=1327 audit(1707811186.111:1211): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 07:59:46.111000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 07:59:46.112000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:59:46.112000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c0018d2090 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 07:59:46.632777 kernel: audit: type=1400 audit(1707811186.112:1212): avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:59:46.632808 kernel: audit: type=1300 audit(1707811186.112:1212): arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c0018d2090 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 07:59:46.632827 kernel: audit: type=1327 audit(1707811186.112:1212): 
proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 07:59:46.112000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 07:59:46.726170 kernel: audit: type=1400 audit(1707811186.183:1213): avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:59:46.183000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:59:46.817350 kernel: audit: type=1300 audit(1707811186.183:1213): arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c008bbe1b0 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 07:59:46.183000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c008bbe1b0 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 07:59:46.916319 kernel: audit: type=1327 audit(1707811186.183:1213): proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 07:59:46.183000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 07:59:47.009951 kernel: audit: type=1400 audit(1707811186.183:1214): avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:59:46.183000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:59:46.183000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5c a1=c00e5fb7e0 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 
07:59:46.183000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 07:59:46.184000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/apiserver.crt" dev="sdb9" ino=524817 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:59:46.184000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c008bbe2d0 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 07:59:46.184000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 07:59:46.926000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:59:46.926000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c00e73e6c0 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 07:59:46.926000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 07:59:46.926000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-client.crt" dev="sdb9" ino=524823 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:59:46.926000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c00e73e720 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 07:59:46.926000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 07:59:46.926000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:59:46.926000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c00e4441a0 
a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 07:59:46.926000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 07:59:50.862000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:59:50.862000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c00114f5c0 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 07:59:50.862000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 07:59:50.867000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:59:50.867000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c00114f5e0 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 07:59:50.867000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 07:59:50.868000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:59:50.868000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c00114f620 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 07:59:50.868000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 07:59:50.872000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" 
path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 07:59:50.872000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c0005d0240 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 07:59:50.872000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 07:59:54.706523 env[1458]: time="2024-02-13T07:59:54.706366073Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 07:59:54.723319 env[1458]: time="2024-02-13T07:59:54.723256668Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:59:54.723464 kubelet[2569]: E0213 07:59:54.723430 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 07:59:54.723464 kubelet[2569]: E0213 07:59:54.723456 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 07:59:54.723660 kubelet[2569]: E0213 07:59:54.723479 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 07:59:54.723660 kubelet[2569]: E0213 07:59:54.723497 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 07:59:55.706550 env[1458]: 
time="2024-02-13T07:59:55.706455644Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 07:59:55.722424 env[1458]: time="2024-02-13T07:59:55.722362720Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:59:55.722624 kubelet[2569]: E0213 07:59:55.722611 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 07:59:55.722673 kubelet[2569]: E0213 07:59:55.722649 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 07:59:55.722702 kubelet[2569]: E0213 07:59:55.722684 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 07:59:55.722759 kubelet[2569]: E0213 07:59:55.722712 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 07:59:56.706584 env[1458]: time="2024-02-13T07:59:56.706471071Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 07:59:56.733790 env[1458]: time="2024-02-13T07:59:56.733725263Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:59:56.733897 kubelet[2569]: E0213 07:59:56.733877 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed 
(delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 07:59:56.734052 kubelet[2569]: E0213 07:59:56.733905 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 07:59:56.734052 kubelet[2569]: E0213 07:59:56.733928 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 07:59:56.734052 kubelet[2569]: E0213 07:59:56.733945 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 07:59:57.706137 env[1458]: time="2024-02-13T07:59:57.706013852Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 07:59:57.721140 env[1458]: time="2024-02-13T07:59:57.721101524Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 07:59:57.721371 kubelet[2569]: E0213 07:59:57.721266 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 07:59:57.721371 kubelet[2569]: E0213 07:59:57.721303 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 07:59:57.721371 kubelet[2569]: E0213 07:59:57.721330 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check 
that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 07:59:57.721371 kubelet[2569]: E0213 07:59:57.721350 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 08:00:06.706039 env[1458]: time="2024-02-13T08:00:06.705934412Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 08:00:06.732898 env[1458]: time="2024-02-13T08:00:06.732863115Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:00:06.733092 kubelet[2569]: E0213 08:00:06.733047 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 08:00:06.733092 kubelet[2569]: E0213 08:00:06.733076 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 08:00:06.733267 kubelet[2569]: E0213 08:00:06.733098 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:00:06.733267 kubelet[2569]: E0213 08:00:06.733114 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 08:00:08.706058 env[1458]: time="2024-02-13T08:00:08.705950006Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 08:00:08.735981 
env[1458]: time="2024-02-13T08:00:08.735922525Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:00:08.736175 kubelet[2569]: E0213 08:00:08.736135 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 08:00:08.736175 kubelet[2569]: E0213 08:00:08.736165 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 08:00:08.736356 kubelet[2569]: E0213 08:00:08.736186 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:00:08.736356 kubelet[2569]: E0213 08:00:08.736204 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 08:00:10.706259 env[1458]: time="2024-02-13T08:00:10.706170222Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 08:00:10.750212 env[1458]: time="2024-02-13T08:00:10.750175119Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:00:10.750380 kubelet[2569]: E0213 08:00:10.750367 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 08:00:10.750563 kubelet[2569]: E0213 08:00:10.750396 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 08:00:10.750563 kubelet[2569]: E0213 08:00:10.750422 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:00:10.750563 kubelet[2569]: E0213 08:00:10.750442 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 08:00:11.707049 env[1458]: time="2024-02-13T08:00:11.706925786Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 08:00:11.723196 env[1458]: time="2024-02-13T08:00:11.723148625Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:00:11.723409 kubelet[2569]: E0213 08:00:11.723374 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 08:00:11.723409 kubelet[2569]: E0213 08:00:11.723401 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 08:00:11.723466 kubelet[2569]: E0213 08:00:11.723426 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:00:11.723466 kubelet[2569]: E0213 08:00:11.723443 2569 
pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 08:00:18.706453 env[1458]: time="2024-02-13T08:00:18.706347160Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 08:00:18.726762 env[1458]: time="2024-02-13T08:00:18.726726893Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:00:18.726982 kubelet[2569]: E0213 08:00:18.726938 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 08:00:18.726982 kubelet[2569]: E0213 08:00:18.726975 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 08:00:18.727179 kubelet[2569]: E0213 08:00:18.727009 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:00:18.727179 kubelet[2569]: E0213 08:00:18.727033 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 08:00:20.706198 env[1458]: time="2024-02-13T08:00:20.706079180Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 08:00:20.725362 env[1458]: time="2024-02-13T08:00:20.725326573Z" level=error msg="StopPodSandbox for 
\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:00:20.725473 kubelet[2569]: E0213 08:00:20.725463 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 08:00:20.725628 kubelet[2569]: E0213 08:00:20.725487 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 08:00:20.725628 kubelet[2569]: E0213 08:00:20.725510 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:00:20.725628 kubelet[2569]: E0213 08:00:20.725526 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 08:00:23.706971 env[1458]: time="2024-02-13T08:00:23.706880168Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 08:00:23.706971 env[1458]: time="2024-02-13T08:00:23.706890870Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 08:00:23.767964 env[1458]: time="2024-02-13T08:00:23.767801341Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:00:23.768440 kubelet[2569]: E0213 08:00:23.768357 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that 
the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 08:00:23.768440 kubelet[2569]: E0213 08:00:23.768443 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 08:00:23.769309 kubelet[2569]: E0213 08:00:23.768544 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:00:23.769309 kubelet[2569]: E0213 08:00:23.768626 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 08:00:23.770104 env[1458]: time="2024-02-13T08:00:23.769946871Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:00:23.770441 kubelet[2569]: E0213 08:00:23.770384 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 08:00:23.770615 kubelet[2569]: E0213 08:00:23.770461 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 08:00:23.770615 kubelet[2569]: E0213 08:00:23.770565 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:00:23.771001 kubelet[2569]: E0213 08:00:23.770670 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for 
\"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 08:00:29.705746 env[1458]: time="2024-02-13T08:00:29.705684802Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 08:00:29.718547 env[1458]: time="2024-02-13T08:00:29.718472487Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:00:29.718732 kubelet[2569]: E0213 08:00:29.718684 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 08:00:29.718732 kubelet[2569]: E0213 08:00:29.718714 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 08:00:29.718957 kubelet[2569]: E0213 08:00:29.718739 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:00:29.718957 kubelet[2569]: E0213 08:00:29.718759 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 08:00:31.707104 env[1458]: time="2024-02-13T08:00:31.706972668Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 08:00:31.733177 env[1458]: time="2024-02-13T08:00:31.733114268Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox 
\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:00:31.733327 kubelet[2569]: E0213 08:00:31.733315 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 08:00:31.733497 kubelet[2569]: E0213 08:00:31.733345 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 08:00:31.733497 kubelet[2569]: E0213 08:00:31.733379 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:00:31.733497 kubelet[2569]: E0213 08:00:31.733405 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 08:00:35.706850 env[1458]: time="2024-02-13T08:00:35.706751520Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 08:00:35.736636 env[1458]: time="2024-02-13T08:00:35.736573265Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:00:35.736850 kubelet[2569]: E0213 08:00:35.736838 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 08:00:35.737025 kubelet[2569]: E0213 08:00:35.736866 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd 
ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 08:00:35.737025 kubelet[2569]: E0213 08:00:35.736898 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:00:35.737025 kubelet[2569]: E0213 08:00:35.736924 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 08:00:38.707037 env[1458]: time="2024-02-13T08:00:38.706907991Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 08:00:38.733881 env[1458]: time="2024-02-13T08:00:38.733819760Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:00:38.733993 kubelet[2569]: E0213 08:00:38.733980 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 08:00:38.734142 kubelet[2569]: E0213 08:00:38.734007 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 08:00:38.734142 kubelet[2569]: E0213 08:00:38.734029 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:00:38.734142 kubelet[2569]: E0213 08:00:38.734047 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox 
\\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 08:00:43.706980 env[1458]: time="2024-02-13T08:00:43.706849285Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 08:00:43.706980 env[1458]: time="2024-02-13T08:00:43.706867444Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 08:00:43.756921 env[1458]: time="2024-02-13T08:00:43.756845868Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:00:43.757183 kubelet[2569]: E0213 08:00:43.757161 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 08:00:43.757584 kubelet[2569]: E0213 08:00:43.757207 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 08:00:43.757584 kubelet[2569]: E0213 08:00:43.757252 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:00:43.757584 kubelet[2569]: E0213 08:00:43.757287 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 08:00:43.757878 env[1458]: time="2024-02-13T08:00:43.757830274Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:00:43.758017 kubelet[2569]: E0213 08:00:43.758003 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 08:00:43.758337 kubelet[2569]: E0213 08:00:43.758030 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 08:00:43.758337 kubelet[2569]: E0213 08:00:43.758064 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:00:43.758337 kubelet[2569]: E0213 08:00:43.758094 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 08:00:46.112000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:00:46.140678 kernel: kauditd_printk_skb: 26 callbacks suppressed Feb 13 08:00:46.140751 kernel: audit: type=1400 audit(1707811246.112:1223): avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:00:46.112000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c0011485d0 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:00:46.355171 kernel: audit: type=1300 audit(1707811246.112:1223): arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c0011485d0 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:00:46.355202 kernel: audit: 
type=1327 audit(1707811246.112:1223): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:00:46.112000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:00:46.448332 kernel: audit: type=1400 audit(1707811246.112:1224): avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:00:46.112000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:00:46.538443 kernel: audit: type=1300 audit(1707811246.112:1224): arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c001d0b780 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:00:46.112000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c001d0b780 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:00:46.112000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:00:46.659703 kernel: audit: type=1327 audit(1707811246.112:1224): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:00:46.184000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:00:46.843028 kernel: audit: type=1400 audit(1707811246.184:1225): avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:00:46.843059 kernel: audit: type=1400 audit(1707811246.184:1226): avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:00:46.184000 
audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:00:46.933122 kernel: audit: type=1300 audit(1707811246.184:1225): arch=c000003e syscall=254 success=no exit=-13 a0=5c a1=c013fd0930 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:00:46.184000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5c a1=c013fd0930 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:00:47.031423 kernel: audit: type=1300 audit(1707811246.184:1226): arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c008c432a0 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:00:46.184000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c008c432a0 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:00:46.184000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:00:46.184000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:00:46.185000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/apiserver.crt" dev="sdb9" ino=524817 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:00:46.185000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c006516090 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:00:46.185000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:00:46.927000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 
scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:00:46.927000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c006516480 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:00:46.927000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:00:46.927000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-client.crt" dev="sdb9" ino=524823 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:00:46.927000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c00a270c60 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:00:46.927000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:00:46.927000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:00:46.927000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c005a630c0 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:00:46.927000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:00:47.706952 env[1458]: time="2024-02-13T08:00:47.706829706Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 08:00:47.756362 env[1458]: time="2024-02-13T08:00:47.756272646Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:00:47.756530 kubelet[2569]: E0213 08:00:47.756509 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network 
for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 08:00:47.756872 kubelet[2569]: E0213 08:00:47.756551 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 08:00:47.756872 kubelet[2569]: E0213 08:00:47.756595 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:00:47.756872 kubelet[2569]: E0213 08:00:47.756639 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 08:00:50.863000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:00:50.863000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c0005d0dc0 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:00:50.863000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:00:50.868000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:00:50.868000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c001d0baa0 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:00:50.868000 audit: PROCTITLE 
proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:00:50.869000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:00:50.869000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c0005d0e00 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:00:50.869000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:00:50.873000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:00:50.873000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c001d0bbe0 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:00:50.873000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:00:52.706589 env[1458]: time="2024-02-13T08:00:52.706504251Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 08:00:52.732795 env[1458]: time="2024-02-13T08:00:52.732732047Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:00:52.732964 kubelet[2569]: E0213 08:00:52.732927 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 08:00:52.732964 kubelet[2569]: E0213 08:00:52.732952 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd 
ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 08:00:52.733135 kubelet[2569]: E0213 08:00:52.732974 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:00:52.733135 kubelet[2569]: E0213 08:00:52.732992 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 08:00:54.706145 env[1458]: time="2024-02-13T08:00:54.706014907Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 08:00:54.759133 env[1458]: time="2024-02-13T08:00:54.759030587Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:00:54.759377 kubelet[2569]: E0213 08:00:54.759327 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 08:00:54.759377 kubelet[2569]: E0213 08:00:54.759377 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 08:00:54.759844 kubelet[2569]: E0213 08:00:54.759430 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:00:54.759844 kubelet[2569]: E0213 08:00:54.759471 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox 
\\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 08:00:58.706963 env[1458]: time="2024-02-13T08:00:58.706864504Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 08:00:58.733141 env[1458]: time="2024-02-13T08:00:58.733059853Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:00:58.733334 kubelet[2569]: E0213 08:00:58.733287 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 08:00:58.733334 kubelet[2569]: E0213 08:00:58.733312 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 08:00:58.733334 kubelet[2569]: E0213 08:00:58.733333 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:00:58.733553 kubelet[2569]: E0213 08:00:58.733352 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 08:01:02.707748 env[1458]: time="2024-02-13T08:01:02.707588993Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 08:01:02.722158 env[1458]: time="2024-02-13T08:01:02.722091141Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:01:02.722284 kubelet[2569]: E0213 08:01:02.722273 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 08:01:02.722485 kubelet[2569]: E0213 08:01:02.722305 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 08:01:02.722485 kubelet[2569]: E0213 08:01:02.722338 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:01:02.722485 kubelet[2569]: E0213 08:01:02.722368 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 08:01:06.706267 env[1458]: time="2024-02-13T08:01:06.706219098Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 08:01:06.723099 env[1458]: time="2024-02-13T08:01:06.723026837Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:01:06.723218 kubelet[2569]: E0213 08:01:06.723205 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 08:01:06.723370 kubelet[2569]: E0213 08:01:06.723237 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 08:01:06.723370 kubelet[2569]: E0213 08:01:06.723261 2569 
kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:01:06.723370 kubelet[2569]: E0213 08:01:06.723278 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 08:01:09.705930 env[1458]: time="2024-02-13T08:01:09.705903158Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 08:01:09.706168 env[1458]: time="2024-02-13T08:01:09.705935463Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 08:01:09.719457 env[1458]: time="2024-02-13T08:01:09.719391135Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:01:09.719591 env[1458]: time="2024-02-13T08:01:09.719511187Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:01:09.719621 kubelet[2569]: E0213 08:01:09.719573 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 08:01:09.719621 kubelet[2569]: E0213 08:01:09.719603 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 08:01:09.719621 
kubelet[2569]: E0213 08:01:09.719605 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 08:01:09.719621 kubelet[2569]: E0213 08:01:09.719620 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 08:01:09.719863 kubelet[2569]: E0213 08:01:09.719648 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:01:09.719863 kubelet[2569]: E0213 08:01:09.719649 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:01:09.719863 kubelet[2569]: E0213 08:01:09.719670 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 08:01:09.719988 kubelet[2569]: E0213 08:01:09.719670 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 08:01:13.030921 update_engine[1448]: I0213 08:01:13.030817 1448 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs Feb 13 08:01:13.030921 update_engine[1448]: I0213 08:01:13.030885 1448 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs Feb 13 08:01:13.032569 update_engine[1448]: I0213 08:01:13.032491 1448 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs Feb 13 08:01:13.033366 update_engine[1448]: I0213 08:01:13.033298 1448 omaha_request_params.cc:62] Current group set to lts Feb 13 08:01:13.033598 update_engine[1448]: I0213 08:01:13.033566 1448 update_attempter.cc:499] Already updated boot flags. 
Skipping. Feb 13 08:01:13.033598 update_engine[1448]: I0213 08:01:13.033584 1448 update_attempter.cc:643] Scheduling an action processor start. Feb 13 08:01:13.033961 update_engine[1448]: I0213 08:01:13.033614 1448 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Feb 13 08:01:13.033961 update_engine[1448]: I0213 08:01:13.033701 1448 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs Feb 13 08:01:13.033961 update_engine[1448]: I0213 08:01:13.033841 1448 omaha_request_action.cc:270] Posting an Omaha request to disabled Feb 13 08:01:13.033961 update_engine[1448]: I0213 08:01:13.033862 1448 omaha_request_action.cc:271] Request: Feb 13 08:01:13.033961 update_engine[1448]: [multi-line Omaha request XML body elided in capture] Feb 13 08:01:13.033961 update_engine[1448]: I0213 08:01:13.033879 1448 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Feb 13 08:01:13.035074 locksmithd[1496]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 Feb 13 08:01:13.036760 update_engine[1448]: I0213 08:01:13.036710 1448 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Feb 13 08:01:13.036951 update_engine[1448]: E0213 08:01:13.036916 1448 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Feb 13 08:01:13.037086 update_engine[1448]: I0213 08:01:13.037060 1448 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 Feb 13 08:01:14.706203 env[1458]: time="2024-02-13T08:01:14.706098553Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 08:01:14.762149 env[1458]: time="2024-02-13T08:01:14.762028371Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:01:14.762470 kubelet[2569]: E0213 08:01:14.762409 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 08:01:14.762973 kubelet[2569]: E0213 08:01:14.762478 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 08:01:14.762973 kubelet[2569]: E0213 08:01:14.762559 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" 
failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:01:14.762973 kubelet[2569]: E0213 08:01:14.762618 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 08:01:18.706629 env[1458]: time="2024-02-13T08:01:18.706455616Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 08:01:18.759584 env[1458]: time="2024-02-13T08:01:18.759488611Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:01:18.759789 kubelet[2569]: E0213 08:01:18.759757 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 08:01:18.760171 kubelet[2569]: E0213 08:01:18.759807 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 08:01:18.760171 kubelet[2569]: E0213 08:01:18.759861 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:01:18.760171 kubelet[2569]: E0213 08:01:18.759902 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 08:01:21.706704 env[1458]: time="2024-02-13T08:01:21.706600069Z" level=info msg="StopPodSandbox for 
\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 08:01:21.758505 env[1458]: time="2024-02-13T08:01:21.758419288Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:01:21.758761 kubelet[2569]: E0213 08:01:21.758704 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 08:01:21.758761 kubelet[2569]: E0213 08:01:21.758754 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 08:01:21.759124 kubelet[2569]: E0213 08:01:21.758800 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:01:21.759124 kubelet[2569]: E0213 08:01:21.758836 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 08:01:22.706378 env[1458]: time="2024-02-13T08:01:22.706234928Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 08:01:22.733300 env[1458]: time="2024-02-13T08:01:22.733245988Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:01:22.733555 kubelet[2569]: E0213 08:01:22.733492 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 08:01:22.733555 kubelet[2569]: E0213 08:01:22.733519 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 08:01:22.733555 kubelet[2569]: E0213 08:01:22.733543 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:01:22.733670 kubelet[2569]: E0213 08:01:22.733561 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 08:01:22.941294 update_engine[1448]: I0213 08:01:22.941176 1448 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Feb 13 08:01:22.942099 update_engine[1448]: I0213 08:01:22.941697 1448 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Feb 13 08:01:22.942099 update_engine[1448]: E0213 08:01:22.941905 1448 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Feb 13 08:01:22.942099 update_engine[1448]: I0213 08:01:22.942071 1448 libcurl_http_fetcher.cc:283] No HTTP response, retry 2 Feb 13 08:01:26.705896 env[1458]: time="2024-02-13T08:01:26.705829924Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 08:01:26.722741 env[1458]: time="2024-02-13T08:01:26.722670899Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:01:26.722841 kubelet[2569]: E0213 08:01:26.722818 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 08:01:26.722988 kubelet[2569]: E0213 08:01:26.722845 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd 
ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 08:01:26.722988 kubelet[2569]: E0213 08:01:26.722865 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:01:26.722988 kubelet[2569]: E0213 08:01:26.722884 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 08:01:32.706163 env[1458]: time="2024-02-13T08:01:32.706022400Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 08:01:32.734809 env[1458]: time="2024-02-13T08:01:32.734743072Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:01:32.734978 kubelet[2569]: E0213 08:01:32.734937 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 08:01:32.734978 kubelet[2569]: E0213 08:01:32.734963 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 08:01:32.735159 kubelet[2569]: E0213 08:01:32.734985 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:01:32.735159 kubelet[2569]: E0213 08:01:32.735003 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox 
\\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 08:01:32.941440 update_engine[1448]: I0213 08:01:32.941313 1448 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Feb 13 08:01:32.942205 update_engine[1448]: I0213 08:01:32.941826 1448 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Feb 13 08:01:32.942205 update_engine[1448]: E0213 08:01:32.942026 1448 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Feb 13 08:01:32.942205 update_engine[1448]: I0213 08:01:32.942198 1448 libcurl_http_fetcher.cc:283] No HTTP response, retry 3 Feb 13 08:01:33.706420 env[1458]: time="2024-02-13T08:01:33.706310478Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 08:01:33.721335 env[1458]: time="2024-02-13T08:01:33.721295529Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:01:33.721485 kubelet[2569]: E0213 08:01:33.721462 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 08:01:33.721542 kubelet[2569]: E0213 08:01:33.721489 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 08:01:33.721542 kubelet[2569]: E0213 08:01:33.721517 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:01:33.721542 kubelet[2569]: E0213 08:01:33.721537 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 
08:01:35.707000 env[1458]: time="2024-02-13T08:01:35.706876378Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 08:01:35.721894 env[1458]: time="2024-02-13T08:01:35.721856656Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:01:35.722065 kubelet[2569]: E0213 08:01:35.722052 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 08:01:35.722232 kubelet[2569]: E0213 08:01:35.722081 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 08:01:35.722232 kubelet[2569]: E0213 08:01:35.722106 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:01:35.722232 kubelet[2569]: E0213 08:01:35.722125 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 08:01:41.706058 env[1458]: time="2024-02-13T08:01:41.705902011Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 08:01:41.753357 env[1458]: time="2024-02-13T08:01:41.753275104Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:01:41.753550 kubelet[2569]: E0213 08:01:41.753532 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin 
type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 08:01:41.753859 kubelet[2569]: E0213 08:01:41.753573 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 08:01:41.753859 kubelet[2569]: E0213 08:01:41.753613 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:01:41.753859 kubelet[2569]: E0213 08:01:41.753659 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 08:01:42.941564 update_engine[1448]: I0213 08:01:42.941445 1448 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Feb 13 08:01:42.942397 update_engine[1448]: I0213 08:01:42.941951 1448 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Feb 13 08:01:42.942397 update_engine[1448]: E0213 08:01:42.942157 1448 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Feb 13 08:01:42.942397 update_engine[1448]: I0213 08:01:42.942306 1448 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Feb 13 08:01:42.942397 update_engine[1448]: I0213 08:01:42.942322 1448 omaha_request_action.cc:621] Omaha request response: Feb 13 08:01:42.942835 update_engine[1448]: E0213 08:01:42.942468 1448 omaha_request_action.cc:640] Omaha request network transfer failed. Feb 13 08:01:42.942835 update_engine[1448]: I0213 08:01:42.942497 1448 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing. Feb 13 08:01:42.942835 update_engine[1448]: I0213 08:01:42.942507 1448 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Feb 13 08:01:42.942835 update_engine[1448]: I0213 08:01:42.942515 1448 update_attempter.cc:306] Processing Done. Feb 13 08:01:42.942835 update_engine[1448]: E0213 08:01:42.942541 1448 update_attempter.cc:619] Update failed. 
Feb 13 08:01:42.942835 update_engine[1448]: I0213 08:01:42.942551 1448 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse Feb 13 08:01:42.942835 update_engine[1448]: I0213 08:01:42.942560 1448 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse) Feb 13 08:01:42.942835 update_engine[1448]: I0213 08:01:42.942570 1448 payload_state.cc:103] Ignoring failures until we get a valid Omaha response. Feb 13 08:01:42.942835 update_engine[1448]: I0213 08:01:42.942750 1448 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Feb 13 08:01:42.942835 update_engine[1448]: I0213 08:01:42.942803 1448 omaha_request_action.cc:270] Posting an Omaha request to disabled Feb 13 08:01:42.942835 update_engine[1448]: I0213 08:01:42.942813 1448 omaha_request_action.cc:271] Request: Feb 13 08:01:42.942835 update_engine[1448]: [six blank records follow here in the source: the multi-line Omaha request XML body was stripped to empty lines during log extraction] Feb 13 08:01:42.942835 update_engine[1448]: I0213 08:01:42.942822 1448 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Feb 13 08:01:42.944465 update_engine[1448]: I0213 08:01:42.943125 1448 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Feb 13 08:01:42.944465 update_engine[1448]: E0213 08:01:42.943363 1448 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Feb 13 08:01:42.944465 update_engine[1448]: I0213 08:01:42.943555 1448 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Feb 13 08:01:42.944465 update_engine[1448]: I0213 08:01:42.943576 1448 omaha_request_action.cc:621] Omaha request response: Feb 13 08:01:42.944465 update_engine[1448]: I0213 08:01:42.943588 1448 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Feb 13 08:01:42.944465 update_engine[1448]: I0213 08:01:42.943596 1448 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Feb 13 08:01:42.944465 update_engine[1448]: I0213 08:01:42.943604 1448 update_attempter.cc:306] Processing Done. Feb 13 08:01:42.944465 update_engine[1448]: I0213 08:01:42.943612 1448 update_attempter.cc:310] Error event sent. 
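Interleaved with the updater retries, the dominant pattern in this stretch of log is the kubelet retrying StopPodSandbox every few seconds for the same four pods (coredns-5d78c9869d-7xbl5, coredns-5d78c9869d-qrnjl, calico-kube-controllers-846b88998b-4vbpv, csi-node-driver-8djc9), with every CNI delete failing on stat /var/lib/calico/nodename. That file is written by the calico/node container at startup and host-mounted under /var/lib/calico/, so its absence means calico-node is not running (or not mounting the directory) on this node. Here is a minimal Python sketch of the same check the plugin performs; the kubectl invocations in the comments are generic remediation steps, an assumption rather than commands taken from this log.

#!/usr/bin/env python3
"""Minimal sketch of the check the Calico CNI plugin keeps failing on."""
import sys

NODENAME = "/var/lib/calico/nodename"  # path named in the error messages above

try:
    with open(NODENAME) as f:
        print(f"calico nodename present: {f.read().strip()!r}")
        sys.exit(0)
except FileNotFoundError:
    print(f"{NODENAME} missing -> every CNI delete fails with ENOENT, "
          "matching the StopPodSandbox errors in this log.")
    # Typical next steps (run manually; namespace matches the pods above):
    #   kubectl -n calico-system get pods -l k8s-app=calico-node -o wide
    #   kubectl -n calico-system logs ds/calico-node -c calico-node
    sys.exit(1)

Until calico-node comes up and writes that file, the kubelet's periodic KillPodSandbox retries seen throughout the rest of this section will keep failing with the identical error.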
Feb 13 08:01:42.944465 update_engine[1448]: I0213 08:01:42.943655 1448 update_check_scheduler.cc:74] Next update check in 48m8s Feb 13 08:01:42.945300 locksmithd[1496]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0 Feb 13 08:01:42.945300 locksmithd[1496]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0 Feb 13 08:01:43.706540 env[1458]: time="2024-02-13T08:01:43.706444258Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 08:01:43.758600 env[1458]: time="2024-02-13T08:01:43.758542351Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:01:43.758868 kubelet[2569]: E0213 08:01:43.758819 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 08:01:43.758868 kubelet[2569]: E0213 08:01:43.758861 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 08:01:43.759219 kubelet[2569]: E0213 08:01:43.758904 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:01:43.759219 kubelet[2569]: E0213 08:01:43.758939 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 08:01:44.706117 env[1458]: time="2024-02-13T08:01:44.706021613Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 08:01:44.755507 env[1458]: time="2024-02-13T08:01:44.755446161Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" 
failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:01:44.755918 kubelet[2569]: E0213 08:01:44.755703 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 08:01:44.755918 kubelet[2569]: E0213 08:01:44.755746 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 08:01:44.755918 kubelet[2569]: E0213 08:01:44.755790 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:01:44.755918 kubelet[2569]: E0213 08:01:44.755828 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 08:01:46.113000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:01:46.142154 kernel: kauditd_printk_skb: 26 callbacks suppressed Feb 13 08:01:46.142229 kernel: audit: type=1400 audit(1707811306.113:1236): avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:01:46.113000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:01:46.326128 kernel: audit: type=1400 audit(1707811306.113:1235): avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:01:46.326159 kernel: audit: type=1300 audit(1707811306.113:1236): arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c0006a5d60 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:01:46.113000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c0006a5d60 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:01:46.113000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c001078990 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:01:46.113000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:01:46.667156 kernel: audit: type=1300 audit(1707811306.113:1235): arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c001078990 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:01:46.667191 kernel: audit: type=1327 audit(1707811306.113:1235): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:01:46.667209 kernel: audit: type=1327 audit(1707811306.113:1236): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:01:46.113000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:01:46.761391 kernel: audit: type=1400 audit(1707811306.185:1237): avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:01:46.185000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:01:46.854003 kernel: audit: type=1300 audit(1707811306.185:1237): arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c00a50acf0 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 
tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:01:46.185000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c00a50acf0 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:01:46.954115 kernel: audit: type=1327 audit(1707811306.185:1237): proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:01:46.185000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:01:47.049946 kernel: audit: type=1400 audit(1707811306.185:1238): avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:01:46.185000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:01:46.185000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c004543060 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:01:46.185000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:01:46.185000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/apiserver.crt" dev="sdb9" ino=524817 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:01:46.185000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c0116803c0 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:01:46.185000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:01:46.928000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 
scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:01:46.928000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c00e6ccc40 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:01:46.928000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:01:46.928000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:01:46.928000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c011680780 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:01:46.928000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:01:46.928000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-client.crt" dev="sdb9" ino=524823 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:01:46.928000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5c a1=c005b79ad0 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:01:46.928000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:01:50.706661 env[1458]: time="2024-02-13T08:01:50.706520775Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 08:01:50.733404 env[1458]: time="2024-02-13T08:01:50.733306109Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:01:50.733579 kubelet[2569]: E0213 08:01:50.733568 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to 
destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 08:01:50.733778 kubelet[2569]: E0213 08:01:50.733597 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 08:01:50.733778 kubelet[2569]: E0213 08:01:50.733622 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:01:50.733778 kubelet[2569]: E0213 08:01:50.733645 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 08:01:50.864000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:01:50.864000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c0007ab9c0 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:01:50.864000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:01:50.869000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:01:50.869000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c0007ab9e0 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:01:50.869000 audit: PROCTITLE 
proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:01:50.870000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:01:50.870000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c0007aba20 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:01:50.870000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:01:50.873000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:01:50.873000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c001aba8e0 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:01:50.873000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:01:52.706672 env[1458]: time="2024-02-13T08:01:52.706544001Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 08:01:52.733082 env[1458]: time="2024-02-13T08:01:52.733047821Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:01:52.733229 kubelet[2569]: E0213 08:01:52.733218 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 08:01:52.733386 kubelet[2569]: E0213 08:01:52.733246 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd 
ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 08:01:52.733386 kubelet[2569]: E0213 08:01:52.733269 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:01:52.733386 kubelet[2569]: E0213 08:01:52.733287 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 08:01:58.706691 env[1458]: time="2024-02-13T08:01:58.706563951Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 08:01:58.733184 env[1458]: time="2024-02-13T08:01:58.733122377Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:01:58.733340 kubelet[2569]: E0213 08:01:58.733328 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 08:01:58.733510 kubelet[2569]: E0213 08:01:58.733354 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 08:01:58.733510 kubelet[2569]: E0213 08:01:58.733378 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:01:58.733510 kubelet[2569]: E0213 08:01:58.733394 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox 
\\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 08:01:59.707548 env[1458]: time="2024-02-13T08:01:59.707414597Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 08:01:59.734341 env[1458]: time="2024-02-13T08:01:59.734306119Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:01:59.734493 kubelet[2569]: E0213 08:01:59.734479 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 08:01:59.734716 kubelet[2569]: E0213 08:01:59.734508 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 08:01:59.734716 kubelet[2569]: E0213 08:01:59.734537 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:01:59.734716 kubelet[2569]: E0213 08:01:59.734556 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 08:02:03.706810 env[1458]: time="2024-02-13T08:02:03.706682033Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 08:02:03.733448 env[1458]: time="2024-02-13T08:02:03.733389674Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:02:03.733580 kubelet[2569]: E0213 08:02:03.733569 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 08:02:03.733734 kubelet[2569]: E0213 08:02:03.733597 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 08:02:03.733734 kubelet[2569]: E0213 08:02:03.733620 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:02:03.733734 kubelet[2569]: E0213 08:02:03.733642 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 08:02:05.709938 env[1458]: time="2024-02-13T08:02:05.709834704Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 08:02:05.723739 env[1458]: time="2024-02-13T08:02:05.723701622Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:02:05.723889 kubelet[2569]: E0213 08:02:05.723876 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 08:02:05.724047 kubelet[2569]: E0213 08:02:05.723906 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 08:02:05.724047 kubelet[2569]: E0213 08:02:05.723930 2569 
kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:02:05.724047 kubelet[2569]: E0213 08:02:05.723949 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 08:02:10.707177 env[1458]: time="2024-02-13T08:02:10.707040683Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 08:02:10.756318 env[1458]: time="2024-02-13T08:02:10.756252092Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:02:10.756563 kubelet[2569]: E0213 08:02:10.756539 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 08:02:10.756947 kubelet[2569]: E0213 08:02:10.756591 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 08:02:10.756947 kubelet[2569]: E0213 08:02:10.756655 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:02:10.756947 kubelet[2569]: E0213 08:02:10.756699 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 08:02:13.707127 env[1458]: time="2024-02-13T08:02:13.706997654Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 08:02:13.761872 env[1458]: time="2024-02-13T08:02:13.761765696Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:02:13.762172 kubelet[2569]: E0213 08:02:13.762113 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 08:02:13.762172 kubelet[2569]: E0213 08:02:13.762172 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 08:02:13.762625 kubelet[2569]: E0213 08:02:13.762232 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:02:13.762625 kubelet[2569]: E0213 08:02:13.762283 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 08:02:16.705647 env[1458]: time="2024-02-13T08:02:16.705606394Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 08:02:16.721814 env[1458]: time="2024-02-13T08:02:16.721748351Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:02:16.721918 kubelet[2569]: 
E0213 08:02:16.721905 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 08:02:16.722066 kubelet[2569]: E0213 08:02:16.721931 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 08:02:16.722066 kubelet[2569]: E0213 08:02:16.721952 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:02:16.722066 kubelet[2569]: E0213 08:02:16.721969 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 08:02:20.706324 env[1458]: time="2024-02-13T08:02:20.706235643Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 08:02:20.733201 env[1458]: time="2024-02-13T08:02:20.733166564Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:02:20.733426 kubelet[2569]: E0213 08:02:20.733389 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 08:02:20.733426 kubelet[2569]: E0213 08:02:20.733414 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 08:02:20.733592 kubelet[2569]: E0213 08:02:20.733436 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with 
KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:02:20.733592 kubelet[2569]: E0213 08:02:20.733454 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 08:02:25.706096 env[1458]: time="2024-02-13T08:02:25.705957342Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 08:02:25.732249 env[1458]: time="2024-02-13T08:02:25.732182848Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:02:25.732415 kubelet[2569]: E0213 08:02:25.732373 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 08:02:25.732415 kubelet[2569]: E0213 08:02:25.732399 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 08:02:25.732622 kubelet[2569]: E0213 08:02:25.732423 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:02:25.732622 kubelet[2569]: E0213 08:02:25.732443 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 08:02:27.706674 env[1458]: time="2024-02-13T08:02:27.706533974Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 08:02:27.732912 env[1458]: time="2024-02-13T08:02:27.732849157Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:02:27.733077 kubelet[2569]: E0213 08:02:27.733028 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 08:02:27.733077 kubelet[2569]: E0213 08:02:27.733056 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 08:02:27.733077 kubelet[2569]: E0213 08:02:27.733078 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:02:27.733301 kubelet[2569]: E0213 08:02:27.733096 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 08:02:30.706316 env[1458]: time="2024-02-13T08:02:30.706207308Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 08:02:30.733260 env[1458]: time="2024-02-13T08:02:30.733225430Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:02:30.733419 kubelet[2569]: E0213 08:02:30.733402 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = 
failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 08:02:30.733614 kubelet[2569]: E0213 08:02:30.733435 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 08:02:30.733614 kubelet[2569]: E0213 08:02:30.733458 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:02:30.733614 kubelet[2569]: E0213 08:02:30.733476 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 08:02:32.706266 env[1458]: time="2024-02-13T08:02:32.706172992Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 08:02:32.761813 env[1458]: time="2024-02-13T08:02:32.761740212Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:02:32.762111 kubelet[2569]: E0213 08:02:32.762047 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 08:02:32.762111 kubelet[2569]: E0213 08:02:32.762098 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 08:02:32.762566 kubelet[2569]: E0213 08:02:32.762151 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox 
\\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:02:32.762566 kubelet[2569]: E0213 08:02:32.762192 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 08:02:40.706130 env[1458]: time="2024-02-13T08:02:40.706035561Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 08:02:40.706130 env[1458]: time="2024-02-13T08:02:40.706066575Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 08:02:40.765894 env[1458]: time="2024-02-13T08:02:40.765713630Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:02:40.766277 env[1458]: time="2024-02-13T08:02:40.766094927Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:02:40.766479 kubelet[2569]: E0213 08:02:40.766196 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 08:02:40.766479 kubelet[2569]: E0213 08:02:40.766278 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 08:02:40.766479 kubelet[2569]: E0213 08:02:40.766384 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 
08:02:40.766479 kubelet[2569]: E0213 08:02:40.766461 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 08:02:40.767795 kubelet[2569]: E0213 08:02:40.766601 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 08:02:40.767795 kubelet[2569]: E0213 08:02:40.766703 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 08:02:40.767795 kubelet[2569]: E0213 08:02:40.766815 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:02:40.767795 kubelet[2569]: E0213 08:02:40.766894 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 08:02:42.706676 env[1458]: time="2024-02-13T08:02:42.706552304Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 08:02:42.733616 env[1458]: time="2024-02-13T08:02:42.733580779Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:02:42.733784 kubelet[2569]: E0213 08:02:42.733742 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": 
plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 08:02:42.733784 kubelet[2569]: E0213 08:02:42.733768 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 08:02:42.733971 kubelet[2569]: E0213 08:02:42.733790 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:02:42.733971 kubelet[2569]: E0213 08:02:42.733807 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 08:02:46.113000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:02:46.157045 kernel: kauditd_printk_skb: 26 callbacks suppressed Feb 13 08:02:46.157175 kernel: audit: type=1400 audit(1707811366.113:1247): avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:02:46.113000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c001ff8d80 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:02:46.368785 kernel: audit: type=1300 audit(1707811366.113:1247): arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c001ff8d80 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:02:46.368830 kernel: audit: type=1400 audit(1707811366.113:1248): avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:02:46.113000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" 
path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:02:46.113000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:02:46.552255 kernel: audit: type=1327 audit(1707811366.113:1247): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:02:46.552296 kernel: audit: type=1300 audit(1707811366.113:1248): arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c0006a5b60 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:02:46.113000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c0006a5b60 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:02:46.672561 kernel: audit: type=1327 audit(1707811366.113:1248): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:02:46.113000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:02:46.765759 kernel: audit: type=1400 audit(1707811366.184:1249): avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/apiserver.crt" dev="sdb9" ino=524817 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:02:46.184000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/apiserver.crt" dev="sdb9" ino=524817 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:02:46.856373 kernel: audit: type=1300 audit(1707811366.184:1249): arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c013fd14d0 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:02:46.184000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c013fd14d0 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" 
subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:02:46.954544 kernel: audit: type=1327 audit(1707811366.184:1249): proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:02:46.184000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:02:46.185000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:02:47.138348 kernel: audit: type=1400 audit(1707811366.185:1250): avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:02:46.185000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c013fd1530 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:02:46.185000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:02:46.185000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:02:46.185000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5c a1=c00e7512e0 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:02:46.185000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:02:46.927000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:02:46.927000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c013fd1c20 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" 
subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:02:46.927000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:02:46.927000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:02:46.927000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c00e5fa200 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:02:46.927000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:02:46.927000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-client.crt" dev="sdb9" ino=524823 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:02:46.927000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5c a1=c013fd1d70 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:02:46.927000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:02:47.706588 env[1458]: time="2024-02-13T08:02:47.706503074Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 08:02:47.731746 env[1458]: time="2024-02-13T08:02:47.731626018Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:02:47.731963 kubelet[2569]: E0213 08:02:47.731950 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 08:02:47.732147 kubelet[2569]: E0213 08:02:47.731982 2569 
kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 08:02:47.732147 kubelet[2569]: E0213 08:02:47.732015 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:02:47.732147 kubelet[2569]: E0213 08:02:47.732040 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 08:02:50.864000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:02:50.864000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c000b22560 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:02:50.864000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:02:50.868000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:02:50.868000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c00015f460 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:02:50.868000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:02:50.870000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 
08:02:50.870000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c0007abd00 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:02:50.870000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:02:50.873000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:02:50.873000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c00015f5a0 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:02:50.873000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:02:52.707053 env[1458]: time="2024-02-13T08:02:52.706957270Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 08:02:52.758802 env[1458]: time="2024-02-13T08:02:52.758693314Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:02:52.759064 kubelet[2569]: E0213 08:02:52.759008 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 08:02:52.759064 kubelet[2569]: E0213 08:02:52.759057 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 08:02:52.759543 kubelet[2569]: E0213 08:02:52.759114 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/\"" Feb 13 08:02:52.759543 kubelet[2569]: E0213 08:02:52.759156 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 08:02:54.707087 env[1458]: time="2024-02-13T08:02:54.706995422Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 08:02:54.736388 env[1458]: time="2024-02-13T08:02:54.736279268Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:02:54.736532 kubelet[2569]: E0213 08:02:54.736517 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 08:02:54.736740 kubelet[2569]: E0213 08:02:54.736544 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 08:02:54.736740 kubelet[2569]: E0213 08:02:54.736568 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:02:54.736740 kubelet[2569]: E0213 08:02:54.736585 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 08:02:56.706139 env[1458]: time="2024-02-13T08:02:56.706042647Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 08:02:56.722440 env[1458]: 
time="2024-02-13T08:02:56.722407837Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:02:56.722596 kubelet[2569]: E0213 08:02:56.722586 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 08:02:56.722768 kubelet[2569]: E0213 08:02:56.722613 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 08:02:56.722768 kubelet[2569]: E0213 08:02:56.722643 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:02:56.722768 kubelet[2569]: E0213 08:02:56.722663 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 08:03:01.707164 env[1458]: time="2024-02-13T08:03:01.707068797Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 08:03:01.733865 env[1458]: time="2024-02-13T08:03:01.733798198Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:03:01.734005 kubelet[2569]: E0213 08:03:01.733994 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 08:03:01.734171 kubelet[2569]: E0213 08:03:01.734020 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 08:03:01.734171 kubelet[2569]: E0213 08:03:01.734042 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:03:01.734171 kubelet[2569]: E0213 08:03:01.734059 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 08:03:06.706506 env[1458]: time="2024-02-13T08:03:06.706409618Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 08:03:06.732516 env[1458]: time="2024-02-13T08:03:06.732478437Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:03:06.749228 kubelet[2569]: E0213 08:03:06.732686 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 08:03:06.749228 kubelet[2569]: E0213 08:03:06.732726 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 08:03:06.749228 kubelet[2569]: E0213 08:03:06.732747 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:03:06.749228 kubelet[2569]: E0213 08:03:06.732767 2569 
pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 08:03:07.706501 env[1458]: time="2024-02-13T08:03:07.706396223Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 08:03:07.733041 env[1458]: time="2024-02-13T08:03:07.733006622Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:03:07.733316 kubelet[2569]: E0213 08:03:07.733228 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 08:03:07.733316 kubelet[2569]: E0213 08:03:07.733286 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 08:03:07.733316 kubelet[2569]: E0213 08:03:07.733307 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:03:07.733422 kubelet[2569]: E0213 08:03:07.733325 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 08:03:10.706988 env[1458]: time="2024-02-13T08:03:10.706859842Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 08:03:10.761397 env[1458]: time="2024-02-13T08:03:10.761293625Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" 
failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:03:10.761593 kubelet[2569]: E0213 08:03:10.761571 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 08:03:10.761997 kubelet[2569]: E0213 08:03:10.761621 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 08:03:10.761997 kubelet[2569]: E0213 08:03:10.761686 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:03:10.761997 kubelet[2569]: E0213 08:03:10.761728 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 08:03:12.694954 systemd[1]: Started sshd@8-145.40.90.207:22-141.98.11.90:17718.service. Feb 13 08:03:12.694000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-145.40.90.207:22-141.98.11.90:17718 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:03:12.735356 kernel: kauditd_printk_skb: 26 callbacks suppressed Feb 13 08:03:12.735415 kernel: audit: type=1130 audit(1707811392.694:1259): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-145.40.90.207:22-141.98.11.90:17718 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:03:14.172395 sshd[7459]: Invalid user admin from 141.98.11.90 port 17718 Feb 13 08:03:14.459236 sshd[7459]: pam_faillock(sshd:auth): User unknown Feb 13 08:03:14.460286 sshd[7459]: pam_unix(sshd:auth): check pass; user unknown Feb 13 08:03:14.460378 sshd[7459]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=141.98.11.90 Feb 13 08:03:14.461325 sshd[7459]: pam_faillock(sshd:auth): User unknown Feb 13 08:03:14.460000 audit[7459]: USER_AUTH pid=7459 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? acct="admin" exe="/usr/sbin/sshd" hostname=141.98.11.90 addr=141.98.11.90 terminal=ssh res=failed' Feb 13 08:03:14.553700 kernel: audit: type=1100 audit(1707811394.460:1260): pid=7459 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? acct="admin" exe="/usr/sbin/sshd" hostname=141.98.11.90 addr=141.98.11.90 terminal=ssh res=failed' Feb 13 08:03:14.705314 env[1458]: time="2024-02-13T08:03:14.705236287Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 08:03:14.718581 env[1458]: time="2024-02-13T08:03:14.718487196Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:03:14.718738 kubelet[2569]: E0213 08:03:14.718723 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 08:03:14.718923 kubelet[2569]: E0213 08:03:14.718762 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 08:03:14.718923 kubelet[2569]: E0213 08:03:14.718799 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:03:14.718923 kubelet[2569]: E0213 08:03:14.718826 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 08:03:16.132057 sshd[7459]: Failed password for invalid user admin from 141.98.11.90 port 17718 ssh2 Feb 13 08:03:17.017023 sshd[7459]: Connection closed by invalid user admin 141.98.11.90 port 17718 [preauth] Feb 13 08:03:17.019597 systemd[1]: sshd@8-145.40.90.207:22-141.98.11.90:17718.service: Deactivated successfully. Feb 13 08:03:17.019000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-145.40.90.207:22-141.98.11.90:17718 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:03:17.110808 kernel: audit: type=1131 audit(1707811397.019:1261): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-145.40.90.207:22-141.98.11.90:17718 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:03:19.706924 env[1458]: time="2024-02-13T08:03:19.706801142Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 08:03:19.733626 env[1458]: time="2024-02-13T08:03:19.733565393Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:03:19.733810 kubelet[2569]: E0213 08:03:19.733767 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 08:03:19.733810 kubelet[2569]: E0213 08:03:19.733805 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 08:03:19.733971 kubelet[2569]: E0213 08:03:19.733827 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:03:19.733971 kubelet[2569]: E0213 08:03:19.733845 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" 
podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 08:03:21.706865 env[1458]: time="2024-02-13T08:03:21.706741503Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 08:03:21.734526 env[1458]: time="2024-02-13T08:03:21.734463821Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:03:21.734757 kubelet[2569]: E0213 08:03:21.734712 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 08:03:21.734757 kubelet[2569]: E0213 08:03:21.734739 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 08:03:21.734933 kubelet[2569]: E0213 08:03:21.734760 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:03:21.734933 kubelet[2569]: E0213 08:03:21.734780 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 08:03:22.706416 env[1458]: time="2024-02-13T08:03:22.706284809Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 08:03:22.732529 env[1458]: time="2024-02-13T08:03:22.732491683Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:03:22.732762 kubelet[2569]: E0213 08:03:22.732665 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox 
\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 08:03:22.732762 kubelet[2569]: E0213 08:03:22.732692 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 08:03:22.732762 kubelet[2569]: E0213 08:03:22.732715 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:03:22.732762 kubelet[2569]: E0213 08:03:22.732733 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 08:03:27.707251 env[1458]: time="2024-02-13T08:03:27.707096521Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 08:03:27.736547 env[1458]: time="2024-02-13T08:03:27.736491596Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:03:27.736807 kubelet[2569]: E0213 08:03:27.736767 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 08:03:27.736807 kubelet[2569]: E0213 08:03:27.736792 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 08:03:27.736977 kubelet[2569]: E0213 08:03:27.736813 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox 
\\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:03:27.736977 kubelet[2569]: E0213 08:03:27.736831 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 08:03:30.707149 env[1458]: time="2024-02-13T08:03:30.707014350Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 08:03:30.733676 env[1458]: time="2024-02-13T08:03:30.733584666Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:03:30.733797 kubelet[2569]: E0213 08:03:30.733777 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 08:03:30.733945 kubelet[2569]: E0213 08:03:30.733806 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 08:03:30.733945 kubelet[2569]: E0213 08:03:30.733830 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:03:30.733945 kubelet[2569]: E0213 08:03:30.733847 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 08:03:35.707067 env[1458]: 
time="2024-02-13T08:03:35.706942945Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 08:03:35.707067 env[1458]: time="2024-02-13T08:03:35.706945527Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 08:03:35.722713 env[1458]: time="2024-02-13T08:03:35.722643082Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:03:35.722905 kubelet[2569]: E0213 08:03:35.722861 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 08:03:35.722905 kubelet[2569]: E0213 08:03:35.722891 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 08:03:35.723113 kubelet[2569]: E0213 08:03:35.722916 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:03:35.723113 kubelet[2569]: E0213 08:03:35.722938 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 08:03:35.723210 env[1458]: time="2024-02-13T08:03:35.723057180Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:03:35.723237 kubelet[2569]: E0213 08:03:35.723142 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin 
type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 08:03:35.723237 kubelet[2569]: E0213 08:03:35.723158 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 08:03:35.723237 kubelet[2569]: E0213 08:03:35.723180 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:03:35.723237 kubelet[2569]: E0213 08:03:35.723196 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 08:03:40.705291 env[1458]: time="2024-02-13T08:03:40.705231215Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 08:03:40.724870 env[1458]: time="2024-02-13T08:03:40.724830502Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:03:40.725102 kubelet[2569]: E0213 08:03:40.725059 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 08:03:40.725102 kubelet[2569]: E0213 08:03:40.725088 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 08:03:40.725304 kubelet[2569]: E0213 08:03:40.725114 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file 
or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:03:40.725304 kubelet[2569]: E0213 08:03:40.725134 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 08:03:45.707015 env[1458]: time="2024-02-13T08:03:45.706925868Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 08:03:45.759038 env[1458]: time="2024-02-13T08:03:45.758946772Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:03:45.759276 kubelet[2569]: E0213 08:03:45.759226 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 08:03:45.759276 kubelet[2569]: E0213 08:03:45.759270 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 08:03:45.759657 kubelet[2569]: E0213 08:03:45.759317 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:03:45.759657 kubelet[2569]: E0213 08:03:45.759351 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 08:03:46.114000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 
scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:03:46.114000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c0015a5fe0 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:03:46.327557 kernel: audit: type=1400 audit(1707811426.114:1262): avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:03:46.327654 kernel: audit: type=1300 audit(1707811426.114:1262): arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c0015a5fe0 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:03:46.327674 kernel: audit: type=1327 audit(1707811426.114:1262): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:03:46.114000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:03:46.421012 kernel: audit: type=1400 audit(1707811426.115:1263): avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:03:46.115000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:03:46.512343 kernel: audit: type=1300 audit(1707811426.115:1263): arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c000ddc0c0 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:03:46.115000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c000ddc0c0 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:03:46.115000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 
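[Editor's note] The PROCTITLE payloads in the audit records above are hex-encoded command lines: the kernel audit subsystem hex-encodes proctitle because argv entries are separated by NUL bytes. A minimal Python sketch to decode one, using the kube-controller-manager value copied from the record above (the value is truncated in the audit record itself, so the last argument comes out truncated too):

    # Decode an audit PROCTITLE hex payload into a readable command line.
    # argv entries are NUL-separated inside the decoded bytes.
    hex_proctitle = "6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269"
    argv = bytes.fromhex(hex_proctitle).split(b"\x00")
    print(" ".join(a.decode() for a in argv))
    # -> kube-controller-manager --allocate-node-cidrs=true
    #    --authentication-kubeconfig=/etc/kubernetes/controller-manager.conf
    #    --authori   (truncated in the audit record, left as-is)

This confirms the AVC denials above are kube-controller-manager (and, in the neighbouring records, kube-apiserver) being refused inotify watches on the PKI files under /etc/kubernetes/pki/.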
Feb 13 08:03:46.726093 kernel: audit: type=1327 audit(1707811426.115:1263): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:03:46.726132 kernel: audit: type=1400 audit(1707811426.185:1264): avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:03:46.185000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:03:46.816174 kernel: audit: type=1300 audit(1707811426.185:1264): arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c009bfaa60 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:03:46.185000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c009bfaa60 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:03:46.914704 kernel: audit: type=1327 audit(1707811426.185:1264): proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:03:46.185000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:03:47.008025 kernel: audit: type=1400 audit(1707811426.185:1265): avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:03:46.185000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:03:46.185000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5c a1=c00f855860 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:03:46.185000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/apiserver.crt" dev="sdb9" ino=524817 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 
tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:03:46.185000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:03:46.185000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5d a1=c009b9e630 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:03:46.185000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:03:46.927000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-client.crt" dev="sdb9" ino=524823 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:03:46.927000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c00fed84e0 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:03:46.927000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:03:46.927000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:03:46.927000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c00e73ea80 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:03:46.927000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:03:46.927000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:03:46.927000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c00a5535e0 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" 
subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:03:46.927000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:03:50.706504 env[1458]: time="2024-02-13T08:03:50.706379545Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 08:03:50.759404 env[1458]: time="2024-02-13T08:03:50.759308604Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:03:50.759710 kubelet[2569]: E0213 08:03:50.759628 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 08:03:50.759710 kubelet[2569]: E0213 08:03:50.759699 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 08:03:50.760206 kubelet[2569]: E0213 08:03:50.759755 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:03:50.760206 kubelet[2569]: E0213 08:03:50.759799 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 08:03:50.864000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:03:50.864000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c001b22800 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" 
exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:03:50.864000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:03:50.869000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:03:50.869000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c001000060 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:03:50.869000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:03:50.870000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:03:50.870000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c000fff3a0 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:03:50.870000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:03:50.874000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:03:50.874000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c001b22a00 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:03:50.874000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:03:51.707131 env[1458]: time="2024-02-13T08:03:51.706988759Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 08:03:51.707131 env[1458]: time="2024-02-13T08:03:51.707063572Z" level=info 
msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 08:03:51.758360 env[1458]: time="2024-02-13T08:03:51.758297727Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:03:51.758556 env[1458]: time="2024-02-13T08:03:51.758509550Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:03:51.758621 kubelet[2569]: E0213 08:03:51.758559 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 08:03:51.758621 kubelet[2569]: E0213 08:03:51.758603 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 08:03:51.758775 kubelet[2569]: E0213 08:03:51.758663 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:03:51.758775 kubelet[2569]: E0213 08:03:51.758692 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 08:03:51.758775 kubelet[2569]: E0213 08:03:51.758703 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 
Feb 13 08:03:51.758775 kubelet[2569]: E0213 08:03:51.758726 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 08:03:51.759042 kubelet[2569]: E0213 08:03:51.758768 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:03:51.759042 kubelet[2569]: E0213 08:03:51.758801 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 08:04:00.706549 env[1458]: time="2024-02-13T08:04:00.706396565Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 08:04:00.733121 env[1458]: time="2024-02-13T08:04:00.733085753Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:04:00.733283 kubelet[2569]: E0213 08:04:00.733271 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 08:04:00.733459 kubelet[2569]: E0213 08:04:00.733299 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 08:04:00.733459 kubelet[2569]: E0213 08:04:00.733322 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:04:00.733459 kubelet[2569]: E0213 08:04:00.733342 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for 
\"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 08:04:04.706549 env[1458]: time="2024-02-13T08:04:04.706425197Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 08:04:04.707378 env[1458]: time="2024-02-13T08:04:04.706538388Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 08:04:04.733581 env[1458]: time="2024-02-13T08:04:04.733546481Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:04:04.733734 env[1458]: time="2024-02-13T08:04:04.733546468Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:04:04.733835 kubelet[2569]: E0213 08:04:04.733773 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 08:04:04.733835 kubelet[2569]: E0213 08:04:04.733831 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 08:04:04.734023 kubelet[2569]: E0213 08:04:04.733773 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 08:04:04.734023 kubelet[2569]: E0213 08:04:04.733852 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no 
such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:04:04.734023 kubelet[2569]: E0213 08:04:04.733858 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 08:04:04.734023 kubelet[2569]: E0213 08:04:04.733870 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 08:04:04.734137 kubelet[2569]: E0213 08:04:04.733879 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:04:04.734137 kubelet[2569]: E0213 08:04:04.733893 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 08:04:07.706855 env[1458]: time="2024-02-13T08:04:07.706754622Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 08:04:07.733318 env[1458]: time="2024-02-13T08:04:07.733242923Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:04:07.733522 kubelet[2569]: E0213 08:04:07.733466 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 08:04:07.733730 kubelet[2569]: E0213 08:04:07.733524 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd 
ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 08:04:07.733730 kubelet[2569]: E0213 08:04:07.733547 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:04:07.733730 kubelet[2569]: E0213 08:04:07.733567 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 08:04:13.706688 env[1458]: time="2024-02-13T08:04:13.706568008Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 08:04:13.721808 env[1458]: time="2024-02-13T08:04:13.721743943Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:04:13.721974 kubelet[2569]: E0213 08:04:13.721960 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 08:04:13.722179 kubelet[2569]: E0213 08:04:13.721992 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 08:04:13.722179 kubelet[2569]: E0213 08:04:13.722027 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:04:13.722179 kubelet[2569]: E0213 08:04:13.722058 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox 
\\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 08:04:16.706022 env[1458]: time="2024-02-13T08:04:16.705837863Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 08:04:16.754859 env[1458]: time="2024-02-13T08:04:16.754796419Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:04:16.755092 kubelet[2569]: E0213 08:04:16.755073 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 08:04:16.755387 kubelet[2569]: E0213 08:04:16.755113 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 08:04:16.755387 kubelet[2569]: E0213 08:04:16.755155 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:04:16.755387 kubelet[2569]: E0213 08:04:16.755194 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 08:04:17.706306 env[1458]: time="2024-02-13T08:04:17.706159588Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 08:04:17.733452 env[1458]: time="2024-02-13T08:04:17.733418818Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:04:17.733700 kubelet[2569]: E0213 08:04:17.733648 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 08:04:17.733700 kubelet[2569]: E0213 08:04:17.733698 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 08:04:17.733769 kubelet[2569]: E0213 08:04:17.733717 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:04:17.733769 kubelet[2569]: E0213 08:04:17.733734 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 08:04:19.707283 env[1458]: time="2024-02-13T08:04:19.707201776Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 08:04:19.724892 env[1458]: time="2024-02-13T08:04:19.724825901Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:04:19.725016 kubelet[2569]: E0213 08:04:19.724984 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 08:04:19.725016 kubelet[2569]: E0213 08:04:19.725013 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 08:04:19.725212 kubelet[2569]: E0213 08:04:19.725037 2569 
kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:04:19.725212 kubelet[2569]: E0213 08:04:19.725056 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 08:04:27.706085 env[1458]: time="2024-02-13T08:04:27.705990517Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 08:04:27.732982 env[1458]: time="2024-02-13T08:04:27.732946918Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:04:27.733177 kubelet[2569]: E0213 08:04:27.733138 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 08:04:27.733177 kubelet[2569]: E0213 08:04:27.733163 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 08:04:27.733362 kubelet[2569]: E0213 08:04:27.733185 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:04:27.733362 kubelet[2569]: E0213 08:04:27.733202 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 08:04:28.706940 env[1458]: time="2024-02-13T08:04:28.706838807Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 08:04:28.706940 env[1458]: time="2024-02-13T08:04:28.706838799Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 08:04:28.733901 env[1458]: time="2024-02-13T08:04:28.733836653Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:04:28.733901 env[1458]: time="2024-02-13T08:04:28.733846723Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:04:28.734097 kubelet[2569]: E0213 08:04:28.734034 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 08:04:28.734097 kubelet[2569]: E0213 08:04:28.734060 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 08:04:28.734097 kubelet[2569]: E0213 08:04:28.734082 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:04:28.734097 kubelet[2569]: E0213 08:04:28.734100 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 08:04:28.734360 kubelet[2569]: E0213 
08:04:28.734034 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 08:04:28.734360 kubelet[2569]: E0213 08:04:28.734121 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 08:04:28.734360 kubelet[2569]: E0213 08:04:28.734141 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:04:28.734360 kubelet[2569]: E0213 08:04:28.734156 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 08:04:32.706262 env[1458]: time="2024-02-13T08:04:32.706159216Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 08:04:32.732153 env[1458]: time="2024-02-13T08:04:32.732086911Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:04:32.732275 kubelet[2569]: E0213 08:04:32.732260 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 08:04:32.732464 kubelet[2569]: E0213 08:04:32.732306 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 08:04:32.732464 kubelet[2569]: E0213 08:04:32.732339 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" 
with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:04:32.732464 kubelet[2569]: E0213 08:04:32.732364 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 08:04:40.706545 env[1458]: time="2024-02-13T08:04:40.706435905Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 08:04:40.732811 env[1458]: time="2024-02-13T08:04:40.732716793Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:04:40.732996 kubelet[2569]: E0213 08:04:40.732984 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 08:04:40.733180 kubelet[2569]: E0213 08:04:40.733013 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 08:04:40.733180 kubelet[2569]: E0213 08:04:40.733045 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:04:40.733180 kubelet[2569]: E0213 08:04:40.733070 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 08:04:41.706498 env[1458]: time="2024-02-13T08:04:41.706410152Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 08:04:41.754712 env[1458]: time="2024-02-13T08:04:41.754605585Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:04:41.755130 kubelet[2569]: E0213 08:04:41.754905 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 08:04:41.755130 kubelet[2569]: E0213 08:04:41.754953 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 08:04:41.755130 kubelet[2569]: E0213 08:04:41.755003 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:04:41.755130 kubelet[2569]: E0213 08:04:41.755048 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 08:04:42.706268 env[1458]: time="2024-02-13T08:04:42.706176263Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 08:04:42.761277 env[1458]: time="2024-02-13T08:04:42.761185098Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:04:42.762099 kubelet[2569]: E0213 08:04:42.761596 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy 
network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 08:04:42.762099 kubelet[2569]: E0213 08:04:42.761687 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 08:04:42.762099 kubelet[2569]: E0213 08:04:42.761767 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:04:42.762099 kubelet[2569]: E0213 08:04:42.761827 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 08:04:43.706719 env[1458]: time="2024-02-13T08:04:43.706615969Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 08:04:43.760697 env[1458]: time="2024-02-13T08:04:43.760611962Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:04:43.760987 kubelet[2569]: E0213 08:04:43.760929 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 08:04:43.760987 kubelet[2569]: E0213 08:04:43.760980 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 08:04:43.761202 kubelet[2569]: E0213 08:04:43.761034 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox 
\\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:04:43.761202 kubelet[2569]: E0213 08:04:43.761081 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 08:04:46.116000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:04:46.159388 kernel: kauditd_printk_skb: 26 callbacks suppressed Feb 13 08:04:46.159482 kernel: audit: type=1400 audit(1707811486.116:1274): avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:04:46.116000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c001078cc0 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:04:46.372305 kernel: audit: type=1300 audit(1707811486.116:1274): arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c001078cc0 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:04:46.372394 kernel: audit: type=1327 audit(1707811486.116:1274): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:04:46.116000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:04:46.465600 kernel: audit: type=1400 audit(1707811486.116:1275): avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:04:46.116000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 
tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:04:46.555793 kernel: audit: type=1300 audit(1707811486.116:1275): arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c000fff540 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:04:46.116000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c000fff540 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:04:46.116000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:04:46.769415 kernel: audit: type=1327 audit(1707811486.116:1275): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:04:46.769501 kernel: audit: type=1400 audit(1707811486.187:1276): avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/apiserver.crt" dev="sdb9" ino=524817 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:04:46.187000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/apiserver.crt" dev="sdb9" ino=524817 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:04:46.860086 kernel: audit: type=1300 audit(1707811486.187:1276): arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c009b9e480 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:04:46.187000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c009b9e480 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:04:46.958506 kernel: audit: type=1327 audit(1707811486.187:1276): proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:04:46.187000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:04:47.051725 kernel: audit: type=1400 
audit(1707811486.187:1277): avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:04:46.187000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:04:46.187000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c009dcf980 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:04:46.187000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:04:46.187000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:04:46.187000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c01164a240 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:04:46.187000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:04:46.930000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:04:46.930000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:04:46.930000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5c a1=c00971c8a0 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:04:46.930000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c00e7c7400 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:04:46.930000 audit: PROCTITLE 
proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:04:46.930000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:04:46.930000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-client.crt" dev="sdb9" ino=524823 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:04:46.930000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c0099143c0 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:04:46.930000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:04:50.866000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:04:50.866000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c000db4040 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:04:50.866000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:04:50.870000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:04:50.870000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c00114e2a0 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:04:50.870000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:04:50.872000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" 
path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:04:50.872000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c000db4080 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:04:50.872000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:04:50.875000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:04:50.875000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c00114e5e0 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:04:50.875000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:04:53.706254 env[1458]: time="2024-02-13T08:04:53.706113275Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 08:04:53.707231 env[1458]: time="2024-02-13T08:04:53.706113266Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 08:04:53.733195 env[1458]: time="2024-02-13T08:04:53.733155366Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:04:53.733325 env[1458]: time="2024-02-13T08:04:53.733259830Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:04:53.733416 kubelet[2569]: E0213 08:04:53.733406 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and 
has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 08:04:53.733607 kubelet[2569]: E0213 08:04:53.733434 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 08:04:53.733607 kubelet[2569]: E0213 08:04:53.733457 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:04:53.733607 kubelet[2569]: E0213 08:04:53.733474 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 08:04:53.733607 kubelet[2569]: E0213 08:04:53.733406 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 08:04:53.733607 kubelet[2569]: E0213 08:04:53.733490 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 08:04:53.733786 kubelet[2569]: E0213 08:04:53.733509 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:04:53.733786 kubelet[2569]: E0213 08:04:53.733523 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 08:04:55.706876 env[1458]: 
time="2024-02-13T08:04:55.706708754Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 08:04:55.707680 env[1458]: time="2024-02-13T08:04:55.706921612Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 08:04:55.733799 env[1458]: time="2024-02-13T08:04:55.733727670Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:04:55.734021 env[1458]: time="2024-02-13T08:04:55.733807256Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:04:55.734064 kubelet[2569]: E0213 08:04:55.733993 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 08:04:55.734064 kubelet[2569]: E0213 08:04:55.733993 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 08:04:55.734064 kubelet[2569]: E0213 08:04:55.734031 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 08:04:55.734064 kubelet[2569]: E0213 08:04:55.734032 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 08:04:55.734064 kubelet[2569]: E0213 08:04:55.734052 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:04:55.734313 kubelet[2569]: E0213 08:04:55.734062 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for 
\"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:04:55.734313 kubelet[2569]: E0213 08:04:55.734068 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 08:04:55.734313 kubelet[2569]: E0213 08:04:55.734084 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 08:05:05.706235 env[1458]: time="2024-02-13T08:05:05.706108155Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 08:05:05.735840 env[1458]: time="2024-02-13T08:05:05.735766809Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:05:05.736080 kubelet[2569]: E0213 08:05:05.736027 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 08:05:05.736080 kubelet[2569]: E0213 08:05:05.736081 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 08:05:05.736275 kubelet[2569]: E0213 08:05:05.736103 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:05:05.736275 kubelet[2569]: E0213 08:05:05.736119 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 08:05:07.706778 env[1458]: time="2024-02-13T08:05:07.706676560Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 08:05:07.708051 env[1458]: time="2024-02-13T08:05:07.706948188Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 08:05:07.733271 env[1458]: time="2024-02-13T08:05:07.733208704Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:05:07.733271 env[1458]: time="2024-02-13T08:05:07.733223236Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:05:07.733419 kubelet[2569]: E0213 08:05:07.733388 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 08:05:07.733419 kubelet[2569]: E0213 08:05:07.733418 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 08:05:07.733600 kubelet[2569]: E0213 08:05:07.733439 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:05:07.733600 kubelet[2569]: E0213 08:05:07.733459 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to 
\"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 08:05:07.733600 kubelet[2569]: E0213 08:05:07.733389 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 08:05:07.733600 kubelet[2569]: E0213 08:05:07.733475 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 08:05:07.733733 kubelet[2569]: E0213 08:05:07.733495 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:05:07.733733 kubelet[2569]: E0213 08:05:07.733517 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 08:05:08.706581 env[1458]: time="2024-02-13T08:05:08.706489319Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 08:05:08.754179 env[1458]: time="2024-02-13T08:05:08.754096984Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:05:08.754535 kubelet[2569]: E0213 08:05:08.754327 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running 
and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 08:05:08.754535 kubelet[2569]: E0213 08:05:08.754367 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 08:05:08.754535 kubelet[2569]: E0213 08:05:08.754413 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:05:08.754535 kubelet[2569]: E0213 08:05:08.754447 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 08:05:19.707331 env[1458]: time="2024-02-13T08:05:19.707199309Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 08:05:19.707331 env[1458]: time="2024-02-13T08:05:19.707228131Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 08:05:19.734001 env[1458]: time="2024-02-13T08:05:19.733943868Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:05:19.734001 env[1458]: time="2024-02-13T08:05:19.733957504Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:05:19.734214 kubelet[2569]: E0213 08:05:19.734104 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 08:05:19.734214 kubelet[2569]: E0213 08:05:19.734125 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc 
error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 08:05:19.734214 kubelet[2569]: E0213 08:05:19.734141 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 08:05:19.734214 kubelet[2569]: E0213 08:05:19.734143 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 08:05:19.734214 kubelet[2569]: E0213 08:05:19.734163 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:05:19.734468 kubelet[2569]: E0213 08:05:19.734169 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:05:19.734468 kubelet[2569]: E0213 08:05:19.734179 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 08:05:19.734468 kubelet[2569]: E0213 08:05:19.734190 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 08:05:21.706869 env[1458]: time="2024-02-13T08:05:21.706783610Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 08:05:21.707695 env[1458]: time="2024-02-13T08:05:21.707014338Z" level=info msg="StopPodSandbox for 
\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 08:05:21.733232 env[1458]: time="2024-02-13T08:05:21.733193829Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:05:21.733404 env[1458]: time="2024-02-13T08:05:21.733242249Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:05:21.733459 kubelet[2569]: E0213 08:05:21.733447 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 08:05:21.733612 kubelet[2569]: E0213 08:05:21.733447 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 08:05:21.733612 kubelet[2569]: E0213 08:05:21.733477 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 08:05:21.733612 kubelet[2569]: E0213 08:05:21.733482 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 08:05:21.733612 kubelet[2569]: E0213 08:05:21.733498 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:05:21.733612 kubelet[2569]: E0213 08:05:21.733502 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:05:21.733795 kubelet[2569]: E0213 08:05:21.733516 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 08:05:21.733795 kubelet[2569]: E0213 08:05:21.733518 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 08:05:31.706555 env[1458]: time="2024-02-13T08:05:31.706448248Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 08:05:31.733763 env[1458]: time="2024-02-13T08:05:31.733701958Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:05:31.733916 kubelet[2569]: E0213 08:05:31.733904 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 08:05:31.734117 kubelet[2569]: E0213 08:05:31.733932 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 08:05:31.734117 kubelet[2569]: E0213 08:05:31.733964 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:05:31.734117 kubelet[2569]: E0213 08:05:31.733989 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to 
\"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 08:05:34.706579 env[1458]: time="2024-02-13T08:05:34.706485114Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 08:05:34.707468 env[1458]: time="2024-02-13T08:05:34.706792996Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 08:05:34.707468 env[1458]: time="2024-02-13T08:05:34.706920160Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 08:05:34.738056 env[1458]: time="2024-02-13T08:05:34.737991858Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:05:34.738189 kubelet[2569]: E0213 08:05:34.738181 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 08:05:34.738350 kubelet[2569]: E0213 08:05:34.738208 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 08:05:34.738350 kubelet[2569]: E0213 08:05:34.738231 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:05:34.738350 kubelet[2569]: E0213 08:05:34.738248 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 08:05:34.739377 env[1458]: time="2024-02-13T08:05:34.739318657Z" 
level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:05:34.739424 env[1458]: time="2024-02-13T08:05:34.739389027Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:05:34.739454 kubelet[2569]: E0213 08:05:34.739415 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 08:05:34.739454 kubelet[2569]: E0213 08:05:34.739433 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 08:05:34.739502 kubelet[2569]: E0213 08:05:34.739455 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:05:34.739502 kubelet[2569]: E0213 08:05:34.739472 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 08:05:34.739502 kubelet[2569]: E0213 08:05:34.739488 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 08:05:34.739598 kubelet[2569]: E0213 08:05:34.739503 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd 
ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 08:05:34.739598 kubelet[2569]: E0213 08:05:34.739523 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:05:34.739598 kubelet[2569]: E0213 08:05:34.739537 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 08:05:43.706607 env[1458]: time="2024-02-13T08:05:43.706469851Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 08:05:43.735755 env[1458]: time="2024-02-13T08:05:43.735719870Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:05:43.735897 kubelet[2569]: E0213 08:05:43.735886 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 08:05:43.736063 kubelet[2569]: E0213 08:05:43.735911 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 08:05:43.736063 kubelet[2569]: E0213 08:05:43.735932 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:05:43.736063 kubelet[2569]: E0213 08:05:43.735950 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for 
sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 08:05:46.116000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:05:46.144803 kernel: kauditd_printk_skb: 26 callbacks suppressed Feb 13 08:05:46.144888 kernel: audit: type=1400 audit(1707811546.116:1286): avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:05:46.116000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c002721980 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:05:46.354595 kernel: audit: type=1300 audit(1707811546.116:1286): arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c002721980 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:05:46.354702 kernel: audit: type=1327 audit(1707811546.116:1286): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:05:46.116000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:05:46.116000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:05:46.536511 kernel: audit: type=1400 audit(1707811546.116:1287): avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:05:46.536544 kernel: audit: type=1300 audit(1707811546.116:1287): arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c000b23340 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:05:46.116000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a 
a1=c000b23340 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:05:46.116000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:05:46.750374 kernel: audit: type=1327 audit(1707811546.116:1287): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:05:46.750458 kernel: audit: type=1400 audit(1707811546.187:1289): avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/apiserver.crt" dev="sdb9" ino=524817 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:05:46.187000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/apiserver.crt" dev="sdb9" ino=524817 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:05:46.841091 kernel: audit: type=1400 audit(1707811546.187:1288): avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:05:46.187000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:05:46.931376 kernel: audit: type=1300 audit(1707811546.187:1288): arch=c000003e syscall=254 success=no exit=-13 a0=5c a1=c00976f840 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:05:46.187000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5c a1=c00976f840 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:05:47.029827 kernel: audit: type=1300 audit(1707811546.187:1289): arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c00b5a72c0 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:05:46.187000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c00b5a72c0 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" 
exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:05:46.187000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:05:46.187000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:05:46.187000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:05:46.187000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c0072ee000 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:05:46.187000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:05:46.929000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:05:46.929000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c009a50c90 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:05:46.929000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:05:46.929000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-client.crt" dev="sdb9" ino=524823 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:05:46.929000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5c a1=c009e4a8d0 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:05:46.929000 audit: PROCTITLE 
proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:05:46.929000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:05:46.929000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c00976f860 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:05:46.929000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:05:47.707183 env[1458]: time="2024-02-13T08:05:47.707094040Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 08:05:47.708194 env[1458]: time="2024-02-13T08:05:47.707323621Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 08:05:47.732562 env[1458]: time="2024-02-13T08:05:47.732522598Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:05:47.732752 env[1458]: time="2024-02-13T08:05:47.732727039Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:05:47.732789 kubelet[2569]: E0213 08:05:47.732778 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 08:05:47.732950 kubelet[2569]: E0213 08:05:47.732805 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 08:05:47.732950 kubelet[2569]: E0213 08:05:47.732832 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed 
to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:05:47.732950 kubelet[2569]: E0213 08:05:47.732841 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 08:05:47.732950 kubelet[2569]: E0213 08:05:47.732851 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 08:05:47.732950 kubelet[2569]: E0213 08:05:47.732856 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 08:05:47.733090 kubelet[2569]: E0213 08:05:47.732876 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:05:47.733090 kubelet[2569]: E0213 08:05:47.732889 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 08:05:49.705576 env[1458]: time="2024-02-13T08:05:49.705548972Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 08:05:49.718071 env[1458]: time="2024-02-13T08:05:49.717975763Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" Feb 13 08:05:49.718274 kubelet[2569]: E0213 08:05:49.718253 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 08:05:49.718517 kubelet[2569]: E0213 08:05:49.718301 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 08:05:49.718907 kubelet[2569]: E0213 08:05:49.718862 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:05:49.718907 kubelet[2569]: E0213 08:05:49.718896 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 08:05:50.865000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:05:50.865000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c000c25f20 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:05:50.865000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:05:50.870000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:05:50.870000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c000c25f60 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" 
subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:05:50.870000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:05:50.871000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:05:50.871000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c001d0af20 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:05:50.871000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:05:50.876000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:05:50.876000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c000bc2780 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:05:50.876000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:05:56.705843 env[1458]: time="2024-02-13T08:05:56.705770634Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 08:05:56.722488 env[1458]: time="2024-02-13T08:05:56.722413259Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:05:56.722595 kubelet[2569]: E0213 08:05:56.722559 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 08:05:56.722595 kubelet[2569]: E0213 08:05:56.722583 2569 
kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 08:05:56.722788 kubelet[2569]: E0213 08:05:56.722605 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:05:56.722788 kubelet[2569]: E0213 08:05:56.722622 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 08:05:58.706089 env[1458]: time="2024-02-13T08:05:58.705969840Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 08:05:58.732570 env[1458]: time="2024-02-13T08:05:58.732512190Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:05:58.732727 kubelet[2569]: E0213 08:05:58.732714 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 08:05:58.732921 kubelet[2569]: E0213 08:05:58.732744 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 08:05:58.732921 kubelet[2569]: E0213 08:05:58.732774 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:05:58.732921 kubelet[2569]: E0213 08:05:58.732800 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: 
\"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 08:06:01.707148 env[1458]: time="2024-02-13T08:06:01.707018580Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 08:06:01.734091 env[1458]: time="2024-02-13T08:06:01.734007209Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:06:01.734286 kubelet[2569]: E0213 08:06:01.734240 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 08:06:01.734286 kubelet[2569]: E0213 08:06:01.734267 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 08:06:01.734286 kubelet[2569]: E0213 08:06:01.734288 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:06:01.734517 kubelet[2569]: E0213 08:06:01.734305 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 08:06:04.706943 env[1458]: time="2024-02-13T08:06:04.706808461Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 08:06:04.761299 env[1458]: time="2024-02-13T08:06:04.761208999Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" 
failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 08:06:04.761544 kubelet[2569]: E0213 08:06:04.761519 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b"
Feb 13 08:06:04.761961 kubelet[2569]: E0213 08:06:04.761567 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b}
Feb 13 08:06:04.761961 kubelet[2569]: E0213 08:06:04.761620 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 13 08:06:04.761961 kubelet[2569]: E0213 08:06:04.761680 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06
Feb 13 08:06:08.706426 env[1458]: time="2024-02-13T08:06:08.706343097Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\""
Feb 13 08:06:08.735350 env[1458]: time="2024-02-13T08:06:08.735271112Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 08:06:08.735529 kubelet[2569]: E0213 08:06:08.735518 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100"
Feb 13 08:06:08.735739 kubelet[2569]: E0213 08:06:08.735545 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100}
Feb 13 08:06:08.735739 kubelet[2569]: E0213 08:06:08.735568 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 13 08:06:08.735739 kubelet[2569]: E0213 08:06:08.735587 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755
Feb 13 08:06:09.705189 env[1458]: time="2024-02-13T08:06:09.705136908Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\""
Feb 13 08:06:09.717491 env[1458]: time="2024-02-13T08:06:09.717455891Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 08:06:09.717730 kubelet[2569]: E0213 08:06:09.717621 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988"
Feb 13 08:06:09.717730 kubelet[2569]: E0213 08:06:09.717654 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988}
Feb 13 08:06:09.717730 kubelet[2569]: E0213 08:06:09.717679 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 13 08:06:09.717730 kubelet[2569]: E0213 08:06:09.717697 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23
Feb 13 08:06:14.706714 env[1458]: time="2024-02-13T08:06:14.706592689Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\""
Feb 13 08:06:14.760521 env[1458]: time="2024-02-13T08:06:14.760455451Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 08:06:14.760787 kubelet[2569]: E0213 08:06:14.760727 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e"
Feb 13 08:06:14.760787 kubelet[2569]: E0213 08:06:14.760777 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e}
Feb 13 08:06:14.761286 kubelet[2569]: E0213 08:06:14.760831 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 13 08:06:14.761286 kubelet[2569]: E0213 08:06:14.760875 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b
Feb 13 08:06:16.706684 env[1458]: time="2024-02-13T08:06:16.706585599Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\""
Feb 13 08:06:16.757414 env[1458]: time="2024-02-13T08:06:16.757346517Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 08:06:16.757652 kubelet[2569]: E0213 08:06:16.757618 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b"
Feb 13 08:06:16.757993 kubelet[2569]: E0213 08:06:16.757678 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b}
Feb 13 08:06:16.757993 kubelet[2569]: E0213 08:06:16.757725 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 13 08:06:16.757993 kubelet[2569]: E0213 08:06:16.757760 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06
Feb 13 08:06:21.706942 env[1458]: time="2024-02-13T08:06:21.706800919Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\""
Feb 13 08:06:21.706942 env[1458]: time="2024-02-13T08:06:21.706800987Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\""
Feb 13 08:06:21.757991 env[1458]: time="2024-02-13T08:06:21.757927717Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 08:06:21.758158 env[1458]: time="2024-02-13T08:06:21.758079171Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 08:06:21.758273 kubelet[2569]: E0213 08:06:21.758221 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100"
Feb 13 08:06:21.758273 kubelet[2569]: E0213 08:06:21.758268 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100}
Feb 13 08:06:21.758665 kubelet[2569]: E0213 08:06:21.758316 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 13 08:06:21.758665 kubelet[2569]: E0213 08:06:21.758334 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988"
Feb 13 08:06:21.758665 kubelet[2569]: E0213 08:06:21.758353 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755
Feb 13 08:06:21.758665 kubelet[2569]: E0213 08:06:21.758375 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988}
Feb 13 08:06:21.758944 kubelet[2569]: E0213 08:06:21.758423 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 13 08:06:21.758944 kubelet[2569]: E0213 08:06:21.758454 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23
Feb 13 08:06:25.706798 env[1458]: time="2024-02-13T08:06:25.706707435Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\""
Feb 13 08:06:25.722469 env[1458]: time="2024-02-13T08:06:25.722402959Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 08:06:25.722602 kubelet[2569]: E0213 08:06:25.722591 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e"
Feb 13 08:06:25.722767 kubelet[2569]: E0213 08:06:25.722618 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e}
Feb 13 08:06:25.722767 kubelet[2569]: E0213 08:06:25.722649 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 13 08:06:25.722767 kubelet[2569]: E0213 08:06:25.722669 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b
Feb 13 08:06:30.706545 env[1458]: time="2024-02-13T08:06:30.706406369Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\""
Feb 13 08:06:30.762952 env[1458]: time="2024-02-13T08:06:30.762888951Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 08:06:30.763099 kubelet[2569]: E0213 08:06:30.763055 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b"
Feb 13 08:06:30.763099 kubelet[2569]: E0213 08:06:30.763079 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b}
Feb 13 08:06:30.763099 kubelet[2569]: E0213 08:06:30.763100 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 13 08:06:30.763330 kubelet[2569]: E0213 08:06:30.763118 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06
Feb 13 08:06:33.706990 env[1458]: time="2024-02-13T08:06:33.706899384Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\""
Feb 13 08:06:33.758272 env[1458]: time="2024-02-13T08:06:33.758175377Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 08:06:33.758482 kubelet[2569]: E0213 08:06:33.758458 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100"
Feb 13 08:06:33.758869 kubelet[2569]: E0213 08:06:33.758513 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100}
Feb 13 08:06:33.758869 kubelet[2569]: E0213 08:06:33.758581 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 13 08:06:33.758869 kubelet[2569]: E0213 08:06:33.758651 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755
Feb 13 08:06:35.705476 env[1458]: time="2024-02-13T08:06:35.705453639Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\""
Feb 13 08:06:35.718118 env[1458]: time="2024-02-13T08:06:35.718082361Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 08:06:35.718320 kubelet[2569]: E0213 08:06:35.718279 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988"
Feb 13 08:06:35.718320 kubelet[2569]: E0213 08:06:35.718307 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988}
Feb 13 08:06:35.718525 kubelet[2569]: E0213 08:06:35.718337 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 13 08:06:35.718525 kubelet[2569]: E0213 08:06:35.718358 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23
Feb 13 08:06:38.706030 env[1458]: time="2024-02-13T08:06:38.705901275Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\""
Feb 13 08:06:38.732361 env[1458]: time="2024-02-13T08:06:38.732293917Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 08:06:38.732541 kubelet[2569]: E0213 08:06:38.732531 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e"
Feb 13 08:06:38.732729 kubelet[2569]: E0213 08:06:38.732557 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e}
Feb 13 08:06:38.732729 kubelet[2569]: E0213 08:06:38.732579 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 13 08:06:38.732729 kubelet[2569]: E0213 08:06:38.732596 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b
Feb 13 08:06:44.706976 env[1458]: time="2024-02-13T08:06:44.706880115Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\""
Feb 13 08:06:44.758282 env[1458]: time="2024-02-13T08:06:44.758215401Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 08:06:44.758521 kubelet[2569]: E0213 08:06:44.758521 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 08:06:44.758921 kubelet[2569]: E0213 08:06:44.758565 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:06:44.758921 kubelet[2569]: E0213 08:06:44.758599 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 08:06:46.117000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:06:46.145944 kernel: kauditd_printk_skb: 26 callbacks suppressed Feb 13 08:06:46.146031 kernel: audit: type=1400 audit(1707811606.117:1298): avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:06:46.117000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c000ffeec0 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:06:46.239671 kernel: audit: type=1300 audit(1707811606.117:1298): arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c000ffeec0 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:06:46.117000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:06:46.455168 kernel: audit: type=1327 audit(1707811606.117:1298): 
proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:06:46.455255 kernel: audit: type=1400 audit(1707811606.117:1299): avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:06:46.117000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:06:46.546584 kernel: audit: type=1300 audit(1707811606.117:1299): arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c000ddc420 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:06:46.117000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c000ddc420 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:06:46.117000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:06:46.762524 kernel: audit: type=1327 audit(1707811606.117:1299): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:06:46.762568 kernel: audit: type=1400 audit(1707811606.188:1300): avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:06:46.188000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:06:46.854559 kernel: audit: type=1300 audit(1707811606.188:1300): arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c008cecde0 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:06:46.188000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c008cecde0 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:06:46.953970 kernel: audit: type=1327 audit(1707811606.188:1300): proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:06:46.188000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:06:46.188000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/apiserver.crt" dev="sdb9" ino=524817 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:06:47.139652 kernel: audit: type=1400 audit(1707811606.188:1301): avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/apiserver.crt" dev="sdb9" ino=524817 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:06:46.188000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5c a1=c0093d1c20 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:06:46.188000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:06:46.188000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:06:46.188000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c00e7f13e0 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:06:46.188000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:06:46.928000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:06:46.928000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c00e57e4a0 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" 
exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:06:46.928000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:06:46.928000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:06:46.928000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c009de0de0 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:06:46.928000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:06:46.928000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-client.crt" dev="sdb9" ino=524823 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:06:46.928000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5d a1=c011c5d5f0 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:06:46.928000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:06:47.706375 env[1458]: time="2024-02-13T08:06:47.706289332Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 08:06:47.763865 env[1458]: time="2024-02-13T08:06:47.763750890Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:06:47.764305 kubelet[2569]: E0213 08:06:47.764222 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 08:06:47.764305 
kubelet[2569]: E0213 08:06:47.764301 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 08:06:47.765144 kubelet[2569]: E0213 08:06:47.764418 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:06:47.765144 kubelet[2569]: E0213 08:06:47.764493 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 08:06:48.707006 env[1458]: time="2024-02-13T08:06:48.706910829Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 08:06:48.737065 env[1458]: time="2024-02-13T08:06:48.737008052Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:06:48.737330 kubelet[2569]: E0213 08:06:48.737258 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 08:06:48.737330 kubelet[2569]: E0213 08:06:48.737325 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 08:06:48.737415 kubelet[2569]: E0213 08:06:48.737349 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:06:48.737415 kubelet[2569]: E0213 08:06:48.737365 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for 
\"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 08:06:50.867000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:06:50.867000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c000d7f1c0 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:06:50.867000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:06:50.872000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:06:50.872000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c001d0b940 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:06:50.872000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:06:50.873000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:06:50.873000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c000fff200 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:06:50.873000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:06:50.878000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 
Feb 13 08:06:50.878000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 08:06:50.878000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c001001440 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null)
Feb 13 08:06:50.878000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269
Feb 13 08:06:51.707134 env[1458]: time="2024-02-13T08:06:51.706992147Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\""
Feb 13 08:06:51.724056 env[1458]: time="2024-02-13T08:06:51.724018178Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 08:06:51.724263 kubelet[2569]: E0213 08:06:51.724221 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e"
Feb 13 08:06:51.724263 kubelet[2569]: E0213 08:06:51.724248 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e}
Feb 13 08:06:51.724475 kubelet[2569]: E0213 08:06:51.724274 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 13 08:06:51.724475 kubelet[2569]: E0213 08:06:51.724294 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b
Feb 13 08:06:57.706808 env[1458]: time="2024-02-13T08:06:57.706676950Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\""
Feb 13 08:06:57.736432 env[1458]: time="2024-02-13T08:06:57.736394413Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 08:06:57.736615 kubelet[2569]: E0213 08:06:57.736604 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b"
Feb 13 08:06:57.736809 kubelet[2569]: E0213 08:06:57.736636 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b}
Feb 13 08:06:57.736809 kubelet[2569]: E0213 08:06:57.736683 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 13 08:06:57.736809 kubelet[2569]: E0213 08:06:57.736700 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06
Feb 13 08:07:00.706509 env[1458]: time="2024-02-13T08:07:00.706417717Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\""
Feb 13 08:07:00.732571 env[1458]: time="2024-02-13T08:07:00.732512184Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 08:07:00.732800 kubelet[2569]: E0213 08:07:00.732759 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100"
Feb 13 08:07:00.732800 kubelet[2569]: E0213 08:07:00.732785 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100}
Feb 13 08:07:00.732980 kubelet[2569]: E0213 08:07:00.732807 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 13 08:07:00.732980 kubelet[2569]: E0213 08:07:00.732824 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755
Feb 13 08:07:01.706850 env[1458]: time="2024-02-13T08:07:01.706764783Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\""
Feb 13 08:07:01.722502 env[1458]: time="2024-02-13T08:07:01.722439189Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 08:07:01.722671 kubelet[2569]: E0213 08:07:01.722636 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988"
Feb 13 08:07:01.722671 kubelet[2569]: E0213 08:07:01.722664 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988}
Feb 13 08:07:01.722749 kubelet[2569]: E0213 08:07:01.722687 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 13 08:07:01.722749 kubelet[2569]: E0213 08:07:01.722706 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 08:07:06.336360 systemd[1]: Starting systemd-tmpfiles-clean.service... Feb 13 08:07:06.341929 systemd-tmpfiles[9538]: /usr/lib/tmpfiles.d/legacy.conf:13: Duplicate line for path "/run/lock", ignoring. Feb 13 08:07:06.342153 systemd-tmpfiles[9538]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Feb 13 08:07:06.342827 systemd-tmpfiles[9538]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Feb 13 08:07:06.353307 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully. Feb 13 08:07:06.353394 systemd[1]: Finished systemd-tmpfiles-clean.service. Feb 13 08:07:06.352000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-clean comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:07:06.381332 kernel: kauditd_printk_skb: 26 callbacks suppressed Feb 13 08:07:06.381421 kernel: audit: type=1130 audit(1707811626.352:1310): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-clean comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:07:06.352000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-clean comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:07:06.470678 kernel: audit: type=1131 audit(1707811626.352:1311): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-clean comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:07:06.471250 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully. 
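The three systemd-tmpfiles warnings just above are harmless: when two tmpfiles.d fragments declare the same path, systemd-tmpfiles honors the first declaration and ignores the rest. A short sketch that surfaces such collisions, assuming the standard search directories; the parsing is deliberately naive (whitespace split, no quoting or specifier handling), which is enough for stock fragments like legacy.conf.

#!/usr/bin/env python3
# Sketch: find tmpfiles.d paths declared more than once -- the condition
# behind the "Duplicate line for path ..., ignoring" warnings above.
# Simplified parsing: splits on whitespace, so quoted paths containing
# spaces would be mishandled.
import glob
from collections import defaultdict

SEARCH_DIRS = ["/etc/tmpfiles.d", "/run/tmpfiles.d", "/usr/lib/tmpfiles.d"]

seen = defaultdict(list)  # path -> [(file, lineno), ...]
for d in SEARCH_DIRS:
    for conf in sorted(glob.glob(f"{d}/*.conf")):
        with open(conf) as fh:
            for lineno, line in enumerate(fh, 1):
                line = line.strip()
                if not line or line.startswith("#"):
                    continue
                fields = line.split()
                if len(fields) >= 2:
                    seen[fields[1]].append((conf, lineno))

for path, places in sorted(seen.items()):
    if len(places) > 1:
        locs = ", ".join(f"{f}:{n}" for f, n in places)
        print(f"duplicate path {path}: {locs}")

On this host it should report the same three paths the warnings name: /run/lock, /root, and /var/lib/systemd.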
Feb 13 08:07:07.706462 env[1458]: time="2024-02-13T08:07:07.706349094Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 08:07:07.760434 env[1458]: time="2024-02-13T08:07:07.760365539Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:07:07.760718 kubelet[2569]: E0213 08:07:07.760657 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 08:07:07.760718 kubelet[2569]: E0213 08:07:07.760710 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 08:07:07.761209 kubelet[2569]: E0213 08:07:07.760772 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:07:07.761209 kubelet[2569]: E0213 08:07:07.760822 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 08:07:11.706393 env[1458]: time="2024-02-13T08:07:11.706258707Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 08:07:11.759848 env[1458]: time="2024-02-13T08:07:11.759751035Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:07:11.760068 kubelet[2569]: E0213 08:07:11.760042 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": 
plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 08:07:11.760471 kubelet[2569]: E0213 08:07:11.760091 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 08:07:11.760471 kubelet[2569]: E0213 08:07:11.760144 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:07:11.760471 kubelet[2569]: E0213 08:07:11.760188 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 08:07:14.705406 env[1458]: time="2024-02-13T08:07:14.705348306Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 08:07:14.721953 env[1458]: time="2024-02-13T08:07:14.721884372Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:07:14.722182 kubelet[2569]: E0213 08:07:14.722142 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 08:07:14.722182 kubelet[2569]: E0213 08:07:14.722176 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 08:07:14.722470 kubelet[2569]: E0213 08:07:14.722210 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no 
such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:07:14.722470 kubelet[2569]: E0213 08:07:14.722236 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 08:07:15.706527 env[1458]: time="2024-02-13T08:07:15.706394294Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 08:07:15.760768 env[1458]: time="2024-02-13T08:07:15.760684640Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:07:15.760976 kubelet[2569]: E0213 08:07:15.760951 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 08:07:15.761345 kubelet[2569]: E0213 08:07:15.761003 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 08:07:15.761345 kubelet[2569]: E0213 08:07:15.761069 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:07:15.761345 kubelet[2569]: E0213 08:07:15.761148 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 08:07:21.707110 env[1458]: time="2024-02-13T08:07:21.706907182Z" level=info msg="StopPodSandbox for 
\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 08:07:21.733937 env[1458]: time="2024-02-13T08:07:21.733873124Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:07:21.734060 kubelet[2569]: E0213 08:07:21.734049 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 08:07:21.734229 kubelet[2569]: E0213 08:07:21.734075 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 08:07:21.734229 kubelet[2569]: E0213 08:07:21.734096 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:07:21.734229 kubelet[2569]: E0213 08:07:21.734115 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 08:07:23.706969 env[1458]: time="2024-02-13T08:07:23.706884320Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 08:07:23.736803 env[1458]: time="2024-02-13T08:07:23.736756403Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:07:23.736966 kubelet[2569]: E0213 08:07:23.736955 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that 
the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 08:07:23.737151 kubelet[2569]: E0213 08:07:23.736992 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 08:07:23.737151 kubelet[2569]: E0213 08:07:23.737020 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:07:23.737151 kubelet[2569]: E0213 08:07:23.737038 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 08:07:27.706541 env[1458]: time="2024-02-13T08:07:27.706450829Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 08:07:27.708143 env[1458]: time="2024-02-13T08:07:27.706454227Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 08:07:27.733562 env[1458]: time="2024-02-13T08:07:27.733528590Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:07:27.733694 env[1458]: time="2024-02-13T08:07:27.733533097Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:07:27.733810 kubelet[2569]: E0213 08:07:27.733739 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 08:07:27.733810 kubelet[2569]: E0213 08:07:27.733774 2569 remote_runtime.go:205] "StopPodSandbox from runtime service 
failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 08:07:27.733810 kubelet[2569]: E0213 08:07:27.733798 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 08:07:27.733810 kubelet[2569]: E0213 08:07:27.733805 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 08:07:27.734050 kubelet[2569]: E0213 08:07:27.733820 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:07:27.734050 kubelet[2569]: E0213 08:07:27.733824 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:07:27.734050 kubelet[2569]: E0213 08:07:27.733838 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 08:07:27.734155 kubelet[2569]: E0213 08:07:27.733839 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 08:07:34.706625 env[1458]: time="2024-02-13T08:07:34.706538133Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 08:07:34.735780 env[1458]: time="2024-02-13T08:07:34.735673299Z" level=error 
msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:07:34.735942 kubelet[2569]: E0213 08:07:34.735930 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 08:07:34.736131 kubelet[2569]: E0213 08:07:34.735957 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 08:07:34.736131 kubelet[2569]: E0213 08:07:34.735979 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:07:34.736131 kubelet[2569]: E0213 08:07:34.735995 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 08:07:38.706704 env[1458]: time="2024-02-13T08:07:38.706569931Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 08:07:38.758212 env[1458]: time="2024-02-13T08:07:38.758126875Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:07:38.758415 kubelet[2569]: E0213 08:07:38.758371 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" 
Feb 13 08:07:38.758415 kubelet[2569]: E0213 08:07:38.758414 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 08:07:38.758784 kubelet[2569]: E0213 08:07:38.758478 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:07:38.758784 kubelet[2569]: E0213 08:07:38.758511 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 08:07:41.706928 env[1458]: time="2024-02-13T08:07:41.706805186Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 08:07:41.706928 env[1458]: time="2024-02-13T08:07:41.706820913Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 08:07:41.733537 env[1458]: time="2024-02-13T08:07:41.733499845Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:07:41.733726 kubelet[2569]: E0213 08:07:41.733689 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 08:07:41.733726 kubelet[2569]: E0213 08:07:41.733716 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 08:07:41.733924 kubelet[2569]: E0213 08:07:41.733739 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has 
mounted /var/lib/calico/\"" Feb 13 08:07:41.733924 kubelet[2569]: E0213 08:07:41.733760 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 08:07:41.734081 env[1458]: time="2024-02-13T08:07:41.734062460Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:07:41.734147 kubelet[2569]: E0213 08:07:41.734139 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 08:07:41.734172 kubelet[2569]: E0213 08:07:41.734161 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 08:07:41.734194 kubelet[2569]: E0213 08:07:41.734178 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:07:41.734194 kubelet[2569]: E0213 08:07:41.734192 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 08:07:46.119000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:07:46.119000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c000795800 a2=fc6 a3=0 items=0 ppid=2242 
pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:07:46.336891 kernel: audit: type=1400 audit(1707811666.119:1312): avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:07:46.336928 kernel: audit: type=1300 audit(1707811666.119:1312): arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c000795800 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:07:46.336945 kernel: audit: type=1327 audit(1707811666.119:1312): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:07:46.119000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:07:46.430271 kernel: audit: type=1400 audit(1707811666.119:1313): avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:07:46.119000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:07:46.520573 kernel: audit: type=1300 audit(1707811666.119:1313): arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c001abb760 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:07:46.119000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c001abb760 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:07:46.641039 kernel: audit: type=1327 audit(1707811666.119:1313): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:07:46.119000 audit: PROCTITLE 
proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:07:46.734608 kernel: audit: type=1400 audit(1707811666.188:1314): avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/apiserver.crt" dev="sdb9" ino=524817 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:07:46.188000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/apiserver.crt" dev="sdb9" ino=524817 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:07:46.825488 kernel: audit: type=1300 audit(1707811666.188:1314): arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c0093d1980 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:07:46.188000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c0093d1980 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:07:46.923930 kernel: audit: type=1327 audit(1707811666.188:1314): proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:07:46.188000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:07:46.188000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:07:47.017717 kernel: audit: type=1400 audit(1707811666.188:1315): avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:07:46.188000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c00e7f0660 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:07:46.188000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:07:46.188000 
audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:07:46.188000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5c a1=c00976aed0 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:07:46.188000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:07:46.931000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-client.crt" dev="sdb9" ino=524823 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:07:46.931000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c00b5a6c30 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:07:46.931000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:07:46.931000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:07:46.931000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5c a1=c00e60e4a0 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:07:46.931000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:07:46.931000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:07:46.931000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5b a1=c00806bd10 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:07:46.931000 audit: PROCTITLE 
proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:07:48.706613 env[1458]: time="2024-02-13T08:07:48.706481827Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 08:07:48.733607 env[1458]: time="2024-02-13T08:07:48.733543702Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:07:48.733835 kubelet[2569]: E0213 08:07:48.733794 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 08:07:48.733835 kubelet[2569]: E0213 08:07:48.733820 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 08:07:48.734018 kubelet[2569]: E0213 08:07:48.733843 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:07:48.734018 kubelet[2569]: E0213 08:07:48.733862 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 08:07:50.869000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:07:50.869000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c0005d0740 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:07:50.869000 
audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:07:50.874000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:07:50.874000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c0006a4940 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:07:50.874000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:07:50.874000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:07:50.874000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c001abb980 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:07:50.874000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:07:50.878000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:07:50.878000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c0005d0860 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:07:50.878000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:07:52.707051 env[1458]: time="2024-02-13T08:07:52.706922883Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 08:07:52.707051 env[1458]: time="2024-02-13T08:07:52.706946239Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 08:07:52.736330 env[1458]: 
time="2024-02-13T08:07:52.736267980Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:07:52.736483 kubelet[2569]: E0213 08:07:52.736469 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 08:07:52.736659 kubelet[2569]: E0213 08:07:52.736505 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 08:07:52.736659 kubelet[2569]: E0213 08:07:52.736537 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:07:52.736659 kubelet[2569]: E0213 08:07:52.736561 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 08:07:52.736760 env[1458]: time="2024-02-13T08:07:52.736605649Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:07:52.736784 kubelet[2569]: E0213 08:07:52.736683 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 08:07:52.736784 kubelet[2569]: E0213 08:07:52.736697 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" 
podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 08:07:52.736784 kubelet[2569]: E0213 08:07:52.736718 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:07:52.736784 kubelet[2569]: E0213 08:07:52.736735 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 08:07:56.706868 env[1458]: time="2024-02-13T08:07:56.706776000Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 08:07:56.758742 env[1458]: time="2024-02-13T08:07:56.758678207Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:07:56.758955 kubelet[2569]: E0213 08:07:56.758939 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 08:07:56.759260 kubelet[2569]: E0213 08:07:56.758982 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 08:07:56.759260 kubelet[2569]: E0213 08:07:56.759029 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:07:56.759260 kubelet[2569]: E0213 08:07:56.759066 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy 
network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 08:08:02.706960 env[1458]: time="2024-02-13T08:08:02.706876272Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 08:08:02.733477 env[1458]: time="2024-02-13T08:08:02.733396014Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:08:02.733667 kubelet[2569]: E0213 08:08:02.733639 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 08:08:02.733858 kubelet[2569]: E0213 08:08:02.733697 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 08:08:02.733858 kubelet[2569]: E0213 08:08:02.733732 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:08:02.733858 kubelet[2569]: E0213 08:08:02.733749 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 08:08:04.706447 env[1458]: time="2024-02-13T08:08:04.706367783Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 08:08:04.732359 env[1458]: time="2024-02-13T08:08:04.732326087Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no 
such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:08:04.732520 kubelet[2569]: E0213 08:08:04.732509 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 08:08:04.732707 kubelet[2569]: E0213 08:08:04.732535 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 08:08:04.732707 kubelet[2569]: E0213 08:08:04.732557 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:08:04.732707 kubelet[2569]: E0213 08:08:04.732574 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 08:08:07.706833 env[1458]: time="2024-02-13T08:08:07.706733515Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 08:08:07.733076 env[1458]: time="2024-02-13T08:08:07.733038463Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:08:07.733267 kubelet[2569]: E0213 08:08:07.733228 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 08:08:07.733267 kubelet[2569]: E0213 08:08:07.733256 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 08:08:07.733443 kubelet[2569]: E0213 08:08:07.733278 2569 
kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:08:07.733443 kubelet[2569]: E0213 08:08:07.733296 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 08:08:10.707106 env[1458]: time="2024-02-13T08:08:10.707021891Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 08:08:10.736373 env[1458]: time="2024-02-13T08:08:10.736309865Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:08:10.736542 kubelet[2569]: E0213 08:08:10.736531 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 08:08:10.736721 kubelet[2569]: E0213 08:08:10.736556 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 08:08:10.736721 kubelet[2569]: E0213 08:08:10.736579 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:08:10.736721 kubelet[2569]: E0213 08:08:10.736597 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 08:08:15.706738 env[1458]: time="2024-02-13T08:08:15.706620801Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 08:08:15.766228 env[1458]: time="2024-02-13T08:08:15.766067050Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:08:15.766616 kubelet[2569]: E0213 08:08:15.766572 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 08:08:15.767415 kubelet[2569]: E0213 08:08:15.766688 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 08:08:15.767415 kubelet[2569]: E0213 08:08:15.766815 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:08:15.767415 kubelet[2569]: E0213 08:08:15.766913 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 08:08:17.706557 env[1458]: time="2024-02-13T08:08:17.706477194Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 08:08:17.722566 env[1458]: time="2024-02-13T08:08:17.722490705Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:08:17.722694 kubelet[2569]: 
E0213 08:08:17.722669 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 08:08:17.722847 kubelet[2569]: E0213 08:08:17.722697 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 08:08:17.722847 kubelet[2569]: E0213 08:08:17.722721 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:08:17.722847 kubelet[2569]: E0213 08:08:17.722739 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 08:08:18.706339 env[1458]: time="2024-02-13T08:08:18.706209664Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 08:08:18.732824 env[1458]: time="2024-02-13T08:08:18.732763565Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:08:18.733082 kubelet[2569]: E0213 08:08:18.732973 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 08:08:18.733082 kubelet[2569]: E0213 08:08:18.733012 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 08:08:18.733082 kubelet[2569]: E0213 08:08:18.733033 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with 
KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:08:18.733082 kubelet[2569]: E0213 08:08:18.733051 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 08:08:19.108767 systemd[1]: Started sshd@9-145.40.90.207:22-139.178.68.195:49240.service. Feb 13 08:08:19.108000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-145.40.90.207:22-139.178.68.195:49240 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:08:19.135412 kernel: kauditd_printk_skb: 26 callbacks suppressed Feb 13 08:08:19.135455 kernel: audit: type=1130 audit(1707811699.108:1324): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-145.40.90.207:22-139.178.68.195:49240 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:08:19.252000 audit[10225]: USER_ACCT pid=10225 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:19.253196 sshd[10225]: Accepted publickey for core from 139.178.68.195 port 49240 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:08:19.257123 sshd[10225]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:08:19.262093 systemd-logind[1446]: New session 10 of user core. Feb 13 08:08:19.262545 systemd[1]: Started session-10.scope. Feb 13 08:08:19.255000 audit[10225]: CRED_ACQ pid=10225 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:19.345469 sshd[10225]: pam_unix(sshd:session): session closed for user core Feb 13 08:08:19.346867 systemd[1]: sshd@9-145.40.90.207:22-139.178.68.195:49240.service: Deactivated successfully. Feb 13 08:08:19.347317 systemd[1]: session-10.scope: Deactivated successfully. Feb 13 08:08:19.347605 systemd-logind[1446]: Session 10 logged out. Waiting for processes to exit. Feb 13 08:08:19.348148 systemd-logind[1446]: Removed session 10. 
Feb 13 08:08:19.436215 kernel: audit: type=1101 audit(1707811699.252:1325): pid=10225 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:19.436250 kernel: audit: type=1103 audit(1707811699.255:1326): pid=10225 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:19.436268 kernel: audit: type=1006 audit(1707811699.255:1327): pid=10225 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=10 res=1 Feb 13 08:08:19.495407 kernel: audit: type=1300 audit(1707811699.255:1327): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd9db38160 a2=3 a3=0 items=0 ppid=1 pid=10225 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=10 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:08:19.255000 audit[10225]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd9db38160 a2=3 a3=0 items=0 ppid=1 pid=10225 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=10 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:08:19.588348 kernel: audit: type=1327 audit(1707811699.255:1327): proctitle=737368643A20636F7265205B707269765D Feb 13 08:08:19.255000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:08:19.619169 kernel: audit: type=1105 audit(1707811699.263:1328): pid=10225 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:19.263000 audit[10225]: USER_START pid=10225 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:19.714712 kernel: audit: type=1103 audit(1707811699.264:1329): pid=10227 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:19.264000 audit[10227]: CRED_ACQ pid=10227 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:19.804100 kernel: audit: type=1106 audit(1707811699.345:1330): pid=10225 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:19.345000 audit[10225]: USER_END pid=10225 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close 
grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:19.899830 kernel: audit: type=1104 audit(1707811699.345:1331): pid=10225 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:19.345000 audit[10225]: CRED_DISP pid=10225 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:19.346000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-145.40.90.207:22-139.178.68.195:49240 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:08:23.706542 env[1458]: time="2024-02-13T08:08:23.706446829Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 08:08:23.736277 env[1458]: time="2024-02-13T08:08:23.736202174Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:08:23.736624 kubelet[2569]: E0213 08:08:23.736471 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 08:08:23.736624 kubelet[2569]: E0213 08:08:23.736496 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 08:08:23.736624 kubelet[2569]: E0213 08:08:23.736518 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:08:23.736624 kubelet[2569]: E0213 08:08:23.736545 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and 
has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 08:08:24.349904 systemd[1]: Started sshd@10-145.40.90.207:22-139.178.68.195:49254.service. Feb 13 08:08:24.349000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-145.40.90.207:22-139.178.68.195:49254 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:08:24.377125 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:08:24.377206 kernel: audit: type=1130 audit(1707811704.349:1333): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-145.40.90.207:22-139.178.68.195:49254 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:08:24.495000 audit[10286]: USER_ACCT pid=10286 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:24.495829 sshd[10286]: Accepted publickey for core from 139.178.68.195 port 49254 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:08:24.497930 sshd[10286]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:08:24.500355 systemd-logind[1446]: New session 11 of user core. Feb 13 08:08:24.500875 systemd[1]: Started session-11.scope. Feb 13 08:08:24.579645 sshd[10286]: pam_unix(sshd:session): session closed for user core Feb 13 08:08:24.581148 systemd[1]: sshd@10-145.40.90.207:22-139.178.68.195:49254.service: Deactivated successfully. Feb 13 08:08:24.581590 systemd[1]: session-11.scope: Deactivated successfully. Feb 13 08:08:24.581994 systemd-logind[1446]: Session 11 logged out. Waiting for processes to exit. Feb 13 08:08:24.582484 systemd-logind[1446]: Removed session 11. 
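Each retry cycle above is kubelet driving the same CRI call: remote_runtime.go issues StopPodSandbox against containerd, gets the rpc error back, and pod_workers.go logs "Error syncing pod, skipping" until the next sync. A hedged sketch of that call using the published CRI client API, assuming containerd's default socket path and reusing the stuck CoreDNS sandbox ID from the log:

```go
package main

import (
	"context"
	"fmt"
	"time"

	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
	runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
)

func main() {
	// Default containerd CRI endpoint; adjust if the runtime differs.
	conn, err := grpc.Dial("unix:///run/containerd/containerd.sock",
		grpc.WithTransportCredentials(insecure.NewCredentials()))
	if err != nil {
		panic(err)
	}
	defer conn.Close()

	client := runtimeapi.NewRuntimeServiceClient(conn)
	ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
	defer cancel()

	// Sandbox ID of coredns-5d78c9869d-qrnjl, taken from the log.
	_, err = client.StopPodSandbox(ctx, &runtimeapi.StopPodSandboxRequest{
		PodSandboxId: "2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100",
	})
	if err != nil {
		// While calico/node is down this returns the same
		// "failed to destroy network for sandbox" error as above.
		fmt.Println("StopPodSandbox failed:", err)
	}
}
```

StopPodSandbox is specified as idempotent on the runtime side, which is why kubelet can safely retry it every few seconds, as the repeating entries show.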
Feb 13 08:08:24.497000 audit[10286]: CRED_ACQ pid=10286 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:24.678650 kernel: audit: type=1101 audit(1707811704.495:1334): pid=10286 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:24.678685 kernel: audit: type=1103 audit(1707811704.497:1335): pid=10286 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:24.678702 kernel: audit: type=1006 audit(1707811704.497:1336): pid=10286 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=11 res=1 Feb 13 08:08:24.737130 kernel: audit: type=1300 audit(1707811704.497:1336): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc047e4d80 a2=3 a3=0 items=0 ppid=1 pid=10286 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=11 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:08:24.497000 audit[10286]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc047e4d80 a2=3 a3=0 items=0 ppid=1 pid=10286 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=11 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:08:24.828885 kernel: audit: type=1327 audit(1707811704.497:1336): proctitle=737368643A20636F7265205B707269765D Feb 13 08:08:24.497000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:08:24.859303 kernel: audit: type=1105 audit(1707811704.502:1337): pid=10286 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:24.502000 audit[10286]: USER_START pid=10286 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:24.502000 audit[10288]: CRED_ACQ pid=10288 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:25.042934 kernel: audit: type=1103 audit(1707811704.502:1338): pid=10288 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:25.042966 kernel: audit: type=1106 audit(1707811704.579:1339): pid=10286 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close 
grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:24.579000 audit[10286]: USER_END pid=10286 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:25.138402 kernel: audit: type=1104 audit(1707811704.579:1340): pid=10286 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:24.579000 audit[10286]: CRED_DISP pid=10286 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:24.580000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-145.40.90.207:22-139.178.68.195:49254 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:08:26.705365 env[1458]: time="2024-02-13T08:08:26.705304199Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 08:08:26.721315 env[1458]: time="2024-02-13T08:08:26.721279341Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:08:26.721460 kubelet[2569]: E0213 08:08:26.721448 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 08:08:26.721617 kubelet[2569]: E0213 08:08:26.721475 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 08:08:26.721617 kubelet[2569]: E0213 08:08:26.721500 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:08:26.721617 kubelet[2569]: E0213 08:08:26.721519 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for 
\"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 08:08:29.589053 systemd[1]: Started sshd@11-145.40.90.207:22-139.178.68.195:37362.service. Feb 13 08:08:29.588000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-145.40.90.207:22-139.178.68.195:37362 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:08:29.615856 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:08:29.615946 kernel: audit: type=1130 audit(1707811709.588:1342): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-145.40.90.207:22-139.178.68.195:37362 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:08:29.733000 audit[10340]: USER_ACCT pid=10340 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:29.733945 sshd[10340]: Accepted publickey for core from 139.178.68.195 port 37362 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:08:29.735931 sshd[10340]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:08:29.738339 systemd-logind[1446]: New session 12 of user core. Feb 13 08:08:29.738894 systemd[1]: Started session-12.scope. Feb 13 08:08:29.819124 sshd[10340]: pam_unix(sshd:session): session closed for user core Feb 13 08:08:29.820468 systemd[1]: sshd@11-145.40.90.207:22-139.178.68.195:37362.service: Deactivated successfully. Feb 13 08:08:29.820946 systemd[1]: session-12.scope: Deactivated successfully. Feb 13 08:08:29.821356 systemd-logind[1446]: Session 12 logged out. Waiting for processes to exit. Feb 13 08:08:29.821932 systemd-logind[1446]: Removed session 12. 
Feb 13 08:08:29.735000 audit[10340]: CRED_ACQ pid=10340 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:29.917838 kernel: audit: type=1101 audit(1707811709.733:1343): pid=10340 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:29.917877 kernel: audit: type=1103 audit(1707811709.735:1344): pid=10340 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:29.917896 kernel: audit: type=1006 audit(1707811709.735:1345): pid=10340 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=12 res=1 Feb 13 08:08:29.976442 kernel: audit: type=1300 audit(1707811709.735:1345): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe937aae60 a2=3 a3=0 items=0 ppid=1 pid=10340 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=12 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:08:29.735000 audit[10340]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe937aae60 a2=3 a3=0 items=0 ppid=1 pid=10340 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=12 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:08:30.068376 kernel: audit: type=1327 audit(1707811709.735:1345): proctitle=737368643A20636F7265205B707269765D Feb 13 08:08:29.735000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:08:30.098809 kernel: audit: type=1105 audit(1707811709.740:1346): pid=10340 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:29.740000 audit[10340]: USER_START pid=10340 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:30.193182 kernel: audit: type=1103 audit(1707811709.740:1347): pid=10342 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:29.740000 audit[10342]: CRED_ACQ pid=10342 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:30.282293 kernel: audit: type=1106 audit(1707811709.819:1348): pid=10340 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close 
grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:29.819000 audit[10340]: USER_END pid=10340 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:29.819000 audit[10340]: CRED_DISP pid=10340 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:30.466939 kernel: audit: type=1104 audit(1707811709.819:1349): pid=10340 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:29.819000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-145.40.90.207:22-139.178.68.195:37362 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:08:31.706714 env[1458]: time="2024-02-13T08:08:31.706601340Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 08:08:31.736830 env[1458]: time="2024-02-13T08:08:31.736759531Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:08:31.737038 kubelet[2569]: E0213 08:08:31.736998 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 08:08:31.737038 kubelet[2569]: E0213 08:08:31.737022 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 08:08:31.737222 kubelet[2569]: E0213 08:08:31.737043 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:08:31.737222 kubelet[2569]: E0213 08:08:31.737060 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for 
\"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 08:08:33.706961 env[1458]: time="2024-02-13T08:08:33.706875552Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 08:08:33.735857 env[1458]: time="2024-02-13T08:08:33.735822519Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:08:33.736094 kubelet[2569]: E0213 08:08:33.736061 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 08:08:33.736261 kubelet[2569]: E0213 08:08:33.736102 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 08:08:33.736261 kubelet[2569]: E0213 08:08:33.736124 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:08:33.736261 kubelet[2569]: E0213 08:08:33.736143 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 08:08:34.706770 env[1458]: time="2024-02-13T08:08:34.706580992Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 08:08:34.733019 env[1458]: time="2024-02-13T08:08:34.732987488Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox 
\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:08:34.733244 kubelet[2569]: E0213 08:08:34.733152 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 08:08:34.733244 kubelet[2569]: E0213 08:08:34.733180 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 08:08:34.733244 kubelet[2569]: E0213 08:08:34.733203 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:08:34.733244 kubelet[2569]: E0213 08:08:34.733222 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 08:08:34.828193 systemd[1]: Started sshd@12-145.40.90.207:22-139.178.68.195:37374.service. Feb 13 08:08:34.827000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-145.40.90.207:22-139.178.68.195:37374 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:08:34.855034 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:08:34.855128 kernel: audit: type=1130 audit(1707811714.827:1351): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-145.40.90.207:22-139.178.68.195:37374 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:08:34.972000 audit[10458]: USER_ACCT pid=10458 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:34.973118 sshd[10458]: Accepted publickey for core from 139.178.68.195 port 37374 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:08:34.975376 sshd[10458]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:08:34.980269 systemd-logind[1446]: New session 13 of user core. Feb 13 08:08:34.980728 systemd[1]: Started session-13.scope. Feb 13 08:08:35.061278 sshd[10458]: pam_unix(sshd:session): session closed for user core Feb 13 08:08:35.062690 systemd[1]: sshd@12-145.40.90.207:22-139.178.68.195:37374.service: Deactivated successfully. Feb 13 08:08:35.063110 systemd[1]: session-13.scope: Deactivated successfully. Feb 13 08:08:35.063434 systemd-logind[1446]: Session 13 logged out. Waiting for processes to exit. Feb 13 08:08:35.063936 systemd-logind[1446]: Removed session 13. Feb 13 08:08:34.974000 audit[10458]: CRED_ACQ pid=10458 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:35.156291 kernel: audit: type=1101 audit(1707811714.972:1352): pid=10458 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:35.156336 kernel: audit: type=1103 audit(1707811714.974:1353): pid=10458 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:35.156353 kernel: audit: type=1006 audit(1707811714.974:1354): pid=10458 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=13 res=1 Feb 13 08:08:35.214910 kernel: audit: type=1300 audit(1707811714.974:1354): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe46f6c460 a2=3 a3=0 items=0 ppid=1 pid=10458 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=13 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:08:34.974000 audit[10458]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe46f6c460 a2=3 a3=0 items=0 ppid=1 pid=10458 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=13 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:08:35.306824 kernel: audit: type=1327 audit(1707811714.974:1354): proctitle=737368643A20636F7265205B707269765D Feb 13 08:08:34.974000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:08:35.337255 kernel: audit: type=1105 audit(1707811714.982:1355): pid=10458 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 
08:08:34.982000 audit[10458]: USER_START pid=10458 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:35.431787 kernel: audit: type=1103 audit(1707811714.982:1356): pid=10460 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:34.982000 audit[10460]: CRED_ACQ pid=10460 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:35.520883 kernel: audit: type=1106 audit(1707811715.061:1357): pid=10458 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:35.061000 audit[10458]: USER_END pid=10458 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:35.616308 kernel: audit: type=1104 audit(1707811715.061:1358): pid=10458 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:35.061000 audit[10458]: CRED_DISP pid=10458 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:35.062000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-145.40.90.207:22-139.178.68.195:37374 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:08:40.070765 systemd[1]: Started sshd@13-145.40.90.207:22-139.178.68.195:50118.service. Feb 13 08:08:40.070000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-145.40.90.207:22-139.178.68.195:50118 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:08:40.098000 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:08:40.098076 kernel: audit: type=1130 audit(1707811720.070:1360): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-145.40.90.207:22-139.178.68.195:50118 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:08:40.214000 audit[10484]: USER_ACCT pid=10484 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:40.215986 sshd[10484]: Accepted publickey for core from 139.178.68.195 port 50118 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:08:40.219951 sshd[10484]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:08:40.224704 systemd-logind[1446]: New session 14 of user core. Feb 13 08:08:40.225192 systemd[1]: Started session-14.scope. Feb 13 08:08:40.303607 sshd[10484]: pam_unix(sshd:session): session closed for user core Feb 13 08:08:40.305121 systemd[1]: sshd@13-145.40.90.207:22-139.178.68.195:50118.service: Deactivated successfully. Feb 13 08:08:40.305570 systemd[1]: session-14.scope: Deactivated successfully. Feb 13 08:08:40.305935 systemd-logind[1446]: Session 14 logged out. Waiting for processes to exit. Feb 13 08:08:40.306415 systemd-logind[1446]: Removed session 14. Feb 13 08:08:40.218000 audit[10484]: CRED_ACQ pid=10484 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:40.397455 kernel: audit: type=1101 audit(1707811720.214:1361): pid=10484 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:40.397496 kernel: audit: type=1103 audit(1707811720.218:1362): pid=10484 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:40.397512 kernel: audit: type=1006 audit(1707811720.218:1363): pid=10484 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=14 res=1 Feb 13 08:08:40.456232 kernel: audit: type=1300 audit(1707811720.218:1363): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffef955c150 a2=3 a3=0 items=0 ppid=1 pid=10484 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=14 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:08:40.218000 audit[10484]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffef955c150 a2=3 a3=0 items=0 ppid=1 pid=10484 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=14 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:08:40.548226 kernel: audit: type=1327 audit(1707811720.218:1363): proctitle=737368643A20636F7265205B707269765D Feb 13 08:08:40.218000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:08:40.578713 kernel: audit: type=1105 audit(1707811720.226:1364): pid=10484 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 
08:08:40.226000 audit[10484]: USER_START pid=10484 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:40.673157 kernel: audit: type=1103 audit(1707811720.227:1365): pid=10486 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:40.227000 audit[10486]: CRED_ACQ pid=10486 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:40.762326 kernel: audit: type=1106 audit(1707811720.303:1366): pid=10484 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:40.303000 audit[10484]: USER_END pid=10484 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:40.857806 kernel: audit: type=1104 audit(1707811720.303:1367): pid=10484 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:40.303000 audit[10484]: CRED_DISP pid=10484 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:40.304000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-145.40.90.207:22-139.178.68.195:50118 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:08:41.706277 env[1458]: time="2024-02-13T08:08:41.706151993Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 08:08:41.733389 env[1458]: time="2024-02-13T08:08:41.733320782Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:08:41.733548 kubelet[2569]: E0213 08:08:41.733535 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 08:08:41.733741 kubelet[2569]: E0213 08:08:41.733564 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 08:08:41.733741 kubelet[2569]: E0213 08:08:41.733587 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:08:41.733741 kubelet[2569]: E0213 08:08:41.733604 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 08:08:45.313593 systemd[1]: Started sshd@14-145.40.90.207:22-139.178.68.195:50126.service. Feb 13 08:08:45.313000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-145.40.90.207:22-139.178.68.195:50126 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:08:45.356202 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:08:45.356345 kernel: audit: type=1130 audit(1707811725.313:1369): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-145.40.90.207:22-139.178.68.195:50126 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:08:45.388789 sshd[10538]: Accepted publickey for core from 139.178.68.195 port 50126 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:08:45.390558 sshd[10538]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:08:45.393051 systemd-logind[1446]: New session 15 of user core. Feb 13 08:08:45.393571 systemd[1]: Started session-15.scope. Feb 13 08:08:45.388000 audit[10538]: USER_ACCT pid=10538 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:45.472348 sshd[10538]: pam_unix(sshd:session): session closed for user core Feb 13 08:08:45.473686 systemd[1]: sshd@14-145.40.90.207:22-139.178.68.195:50126.service: Deactivated successfully. Feb 13 08:08:45.474163 systemd[1]: session-15.scope: Deactivated successfully. Feb 13 08:08:45.474468 systemd-logind[1446]: Session 15 logged out. Waiting for processes to exit. Feb 13 08:08:45.474956 systemd-logind[1446]: Removed session 15. Feb 13 08:08:45.537171 kernel: audit: type=1101 audit(1707811725.388:1370): pid=10538 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:45.537208 kernel: audit: type=1103 audit(1707811725.389:1371): pid=10538 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:45.389000 audit[10538]: CRED_ACQ pid=10538 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:45.627717 kernel: audit: type=1006 audit(1707811725.389:1372): pid=10538 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=15 res=1 Feb 13 08:08:45.686333 kernel: audit: type=1300 audit(1707811725.389:1372): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff4f6f8b20 a2=3 a3=0 items=0 ppid=1 pid=10538 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=15 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:08:45.389000 audit[10538]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff4f6f8b20 a2=3 a3=0 items=0 ppid=1 pid=10538 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=15 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:08:45.705859 env[1458]: time="2024-02-13T08:08:45.705834605Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 08:08:45.717756 env[1458]: time="2024-02-13T08:08:45.717721024Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:08:45.717920 kubelet[2569]: E0213 08:08:45.717909 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 08:08:45.718110 kubelet[2569]: E0213 08:08:45.717938 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 08:08:45.718110 kubelet[2569]: E0213 08:08:45.717970 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:08:45.718110 kubelet[2569]: E0213 08:08:45.717996 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 08:08:45.778375 kernel: audit: type=1327 audit(1707811725.389:1372): proctitle=737368643A20636F7265205B707269765D Feb 13 08:08:45.389000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:08:45.808836 kernel: audit: type=1105 audit(1707811725.395:1373): pid=10538 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:45.395000 audit[10538]: USER_START pid=10538 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:45.903300 kernel: audit: type=1103 audit(1707811725.395:1374): pid=10540 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:45.395000 audit[10540]: CRED_ACQ pid=10540 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 
13 08:08:45.472000 audit[10538]: USER_END pid=10538 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:46.088027 kernel: audit: type=1106 audit(1707811725.472:1375): pid=10538 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:46.088125 kernel: audit: type=1104 audit(1707811725.472:1376): pid=10538 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:45.472000 audit[10538]: CRED_DISP pid=10538 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:45.473000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-145.40.90.207:22-139.178.68.195:50126 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:08:46.119000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:08:46.119000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c0006a5ee0 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:08:46.119000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:08:46.119000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:08:46.119000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c002305680 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:08:46.119000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:08:46.188000 audit[2394]: AVC avc: denied { watch } for 
pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/apiserver.crt" dev="sdb9" ino=524817 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:08:46.188000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5c a1=c0062603c0 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:08:46.188000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:08:46.188000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:08:46.188000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5c a1=c00407c380 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:08:46.188000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:08:46.188000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:08:46.188000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5c a1=c006260420 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:08:46.188000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:08:46.932000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:08:46.932000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5c a1=c0062605a0 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:08:46.932000 audit: PROCTITLE 
proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:08:46.932000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:08:46.932000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=62 a1=c0056d28c0 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:08:46.932000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:08:46.932000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-client.crt" dev="sdb9" ino=524823 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:08:46.932000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5c a1=c00b5a7470 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:08:46.932000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:08:47.706314 env[1458]: time="2024-02-13T08:08:47.706240570Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 08:08:47.739097 env[1458]: time="2024-02-13T08:08:47.739009012Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:08:47.739405 kubelet[2569]: E0213 08:08:47.739377 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 08:08:47.739846 kubelet[2569]: E0213 08:08:47.739434 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd 
ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 08:08:47.739846 kubelet[2569]: E0213 08:08:47.739493 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:08:47.739846 kubelet[2569]: E0213 08:08:47.739542 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 08:08:48.706934 env[1458]: time="2024-02-13T08:08:48.706846317Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 08:08:48.762775 env[1458]: time="2024-02-13T08:08:48.762675140Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:08:48.763057 kubelet[2569]: E0213 08:08:48.762960 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 08:08:48.763057 kubelet[2569]: E0213 08:08:48.763031 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 08:08:48.763553 kubelet[2569]: E0213 08:08:48.763110 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:08:48.763553 kubelet[2569]: E0213 08:08:48.763183 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox 
\\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 08:08:50.475628 systemd[1]: Started sshd@15-145.40.90.207:22-139.178.68.195:40236.service. Feb 13 08:08:50.475000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-145.40.90.207:22-139.178.68.195:40236 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:08:50.502894 kernel: kauditd_printk_skb: 25 callbacks suppressed Feb 13 08:08:50.502967 kernel: audit: type=1130 audit(1707811730.475:1386): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-145.40.90.207:22-139.178.68.195:40236 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:08:50.620000 audit[10652]: USER_ACCT pid=10652 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:50.620961 sshd[10652]: Accepted publickey for core from 139.178.68.195 port 40236 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:08:50.622153 sshd[10652]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:08:50.624593 systemd-logind[1446]: New session 16 of user core. Feb 13 08:08:50.625130 systemd[1]: Started session-16.scope. 
Feb 13 08:08:50.621000 audit[10652]: CRED_ACQ pid=10652 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:50.802920 kernel: audit: type=1101 audit(1707811730.620:1387): pid=10652 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:50.802950 kernel: audit: type=1103 audit(1707811730.621:1388): pid=10652 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:50.802970 kernel: audit: type=1006 audit(1707811730.621:1389): pid=10652 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=16 res=1 Feb 13 08:08:50.861542 kernel: audit: type=1300 audit(1707811730.621:1389): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff7bd22990 a2=3 a3=0 items=0 ppid=1 pid=10652 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=16 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:08:50.621000 audit[10652]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff7bd22990 a2=3 a3=0 items=0 ppid=1 pid=10652 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=16 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:08:50.896002 sshd[10652]: pam_unix(sshd:session): session closed for user core Feb 13 08:08:50.897367 systemd[1]: sshd@15-145.40.90.207:22-139.178.68.195:40236.service: Deactivated successfully. Feb 13 08:08:50.897805 systemd[1]: session-16.scope: Deactivated successfully. Feb 13 08:08:50.898192 systemd-logind[1446]: Session 16 logged out. Waiting for processes to exit. Feb 13 08:08:50.898621 systemd-logind[1446]: Removed session 16. 
Feb 13 08:08:50.621000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:08:50.984031 kernel: audit: type=1327 audit(1707811730.621:1389): proctitle=737368643A20636F7265205B707269765D Feb 13 08:08:50.984065 kernel: audit: type=1105 audit(1707811730.626:1390): pid=10652 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:50.626000 audit[10652]: USER_START pid=10652 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:51.078479 kernel: audit: type=1103 audit(1707811730.627:1391): pid=10654 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:50.627000 audit[10654]: CRED_ACQ pid=10654 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:51.167702 kernel: audit: type=1400 audit(1707811730.870:1392): avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:08:50.870000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:08:51.257298 kernel: audit: type=1300 audit(1707811730.870:1392): arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c000353aa0 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:08:50.870000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c000353aa0 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:08:50.870000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:08:50.875000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:08:50.875000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c000353ac0 
a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:08:50.875000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:08:50.875000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:08:50.875000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=c a1=c000353ae0 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:08:50.875000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:08:50.879000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:08:50.879000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c001bb7b40 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:08:50.879000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:08:50.895000 audit[10652]: USER_END pid=10652 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:50.896000 audit[10652]: CRED_DISP pid=10652 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:50.896000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-145.40.90.207:22-139.178.68.195:40236 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:08:53.706780 env[1458]: time="2024-02-13T08:08:53.706670706Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 08:08:53.733795 env[1458]: time="2024-02-13T08:08:53.733760563Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:08:53.733989 kubelet[2569]: E0213 08:08:53.733950 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 08:08:53.733989 kubelet[2569]: E0213 08:08:53.733974 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 08:08:53.734183 kubelet[2569]: E0213 08:08:53.733997 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:08:53.734183 kubelet[2569]: E0213 08:08:53.734016 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 08:08:55.905629 systemd[1]: Started sshd@16-145.40.90.207:22-139.178.68.195:40252.service. Feb 13 08:08:55.905000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-145.40.90.207:22-139.178.68.195:40252 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:08:55.932682 kernel: kauditd_printk_skb: 13 callbacks suppressed Feb 13 08:08:55.932807 kernel: audit: type=1130 audit(1707811735.905:1399): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-145.40.90.207:22-139.178.68.195:40252 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:08:56.050000 audit[10709]: USER_ACCT pid=10709 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:56.050993 sshd[10709]: Accepted publickey for core from 139.178.68.195 port 40252 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:08:56.052462 sshd[10709]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:08:56.054867 systemd-logind[1446]: New session 17 of user core. Feb 13 08:08:56.055323 systemd[1]: Started session-17.scope. Feb 13 08:08:56.051000 audit[10709]: CRED_ACQ pid=10709 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:56.144838 sshd[10709]: pam_unix(sshd:session): session closed for user core Feb 13 08:08:56.146100 systemd[1]: sshd@16-145.40.90.207:22-139.178.68.195:40252.service: Deactivated successfully. Feb 13 08:08:56.146530 systemd[1]: session-17.scope: Deactivated successfully. Feb 13 08:08:56.146920 systemd-logind[1446]: Session 17 logged out. Waiting for processes to exit. Feb 13 08:08:56.147455 systemd-logind[1446]: Removed session 17. Feb 13 08:08:56.233229 kernel: audit: type=1101 audit(1707811736.050:1400): pid=10709 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:56.233268 kernel: audit: type=1103 audit(1707811736.051:1401): pid=10709 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:56.233287 kernel: audit: type=1006 audit(1707811736.051:1402): pid=10709 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=17 res=1 Feb 13 08:08:56.291873 kernel: audit: type=1300 audit(1707811736.051:1402): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff6342b8a0 a2=3 a3=0 items=0 ppid=1 pid=10709 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=17 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:08:56.051000 audit[10709]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff6342b8a0 a2=3 a3=0 items=0 ppid=1 pid=10709 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=17 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:08:56.383909 kernel: audit: type=1327 audit(1707811736.051:1402): proctitle=737368643A20636F7265205B707269765D Feb 13 08:08:56.051000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:08:56.414399 kernel: audit: type=1105 audit(1707811736.057:1403): pid=10709 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 
08:08:56.057000 audit[10709]: USER_START pid=10709 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:56.509161 kernel: audit: type=1103 audit(1707811736.057:1404): pid=10711 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:56.057000 audit[10711]: CRED_ACQ pid=10711 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:56.598485 kernel: audit: type=1106 audit(1707811736.144:1405): pid=10709 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:56.144000 audit[10709]: USER_END pid=10709 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:56.694028 kernel: audit: type=1104 audit(1707811736.144:1406): pid=10709 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:56.144000 audit[10709]: CRED_DISP pid=10709 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:08:56.145000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-145.40.90.207:22-139.178.68.195:40252 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:09:00.453096 systemd[1]: Started sshd@17-145.40.90.207:22-101.36.65.131:28423.service. Feb 13 08:09:00.452000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-145.40.90.207:22-101.36.65.131:28423 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:09:00.706590 env[1458]: time="2024-02-13T08:09:00.706343167Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 08:09:00.732309 env[1458]: time="2024-02-13T08:09:00.732246185Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:09:00.732429 kubelet[2569]: E0213 08:09:00.732417 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 08:09:00.732589 kubelet[2569]: E0213 08:09:00.732444 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 08:09:00.732589 kubelet[2569]: E0213 08:09:00.732467 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:09:00.732589 kubelet[2569]: E0213 08:09:00.732488 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 08:09:01.153775 systemd[1]: Started sshd@18-145.40.90.207:22-139.178.68.195:47068.service. Feb 13 08:09:01.153000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-145.40.90.207:22-139.178.68.195:47068 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:09:01.180793 kernel: kauditd_printk_skb: 2 callbacks suppressed Feb 13 08:09:01.180828 kernel: audit: type=1130 audit(1707811741.153:1409): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-145.40.90.207:22-139.178.68.195:47068 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:09:01.297000 audit[10765]: USER_ACCT pid=10765 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:01.298521 sshd[10765]: Accepted publickey for core from 139.178.68.195 port 47068 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:09:01.299943 sshd[10765]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:09:01.302298 systemd-logind[1446]: New session 18 of user core. Feb 13 08:09:01.302761 systemd[1]: Started session-18.scope. Feb 13 08:09:01.313419 sshd[10735]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=101.36.65.131 user=root Feb 13 08:09:01.382038 sshd[10765]: pam_unix(sshd:session): session closed for user core Feb 13 08:09:01.383374 systemd[1]: sshd@18-145.40.90.207:22-139.178.68.195:47068.service: Deactivated successfully. Feb 13 08:09:01.383818 systemd[1]: session-18.scope: Deactivated successfully. Feb 13 08:09:01.384216 systemd-logind[1446]: Session 18 logged out. Waiting for processes to exit. Feb 13 08:09:01.384613 systemd-logind[1446]: Removed session 18. Feb 13 08:09:01.299000 audit[10765]: CRED_ACQ pid=10765 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:01.480710 kernel: audit: type=1101 audit(1707811741.297:1410): pid=10765 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:01.480749 kernel: audit: type=1103 audit(1707811741.299:1411): pid=10765 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:01.480767 kernel: audit: type=1006 audit(1707811741.299:1412): pid=10765 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=18 res=1 Feb 13 08:09:01.539348 kernel: audit: type=1300 audit(1707811741.299:1412): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffccc090f30 a2=3 a3=0 items=0 ppid=1 pid=10765 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=18 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:09:01.299000 audit[10765]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffccc090f30 a2=3 a3=0 items=0 ppid=1 pid=10765 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=18 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:09:01.631521 kernel: audit: type=1327 audit(1707811741.299:1412): proctitle=737368643A20636F7265205B707269765D Feb 13 08:09:01.299000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:09:01.662028 kernel: audit: type=1105 audit(1707811741.304:1413): pid=10765 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open 
grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:01.304000 audit[10765]: USER_START pid=10765 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:01.705849 env[1458]: time="2024-02-13T08:09:01.705827906Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 08:09:01.717873 env[1458]: time="2024-02-13T08:09:01.717836232Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:09:01.718105 kubelet[2569]: E0213 08:09:01.718030 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 08:09:01.718105 kubelet[2569]: E0213 08:09:01.718060 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 08:09:01.718105 kubelet[2569]: E0213 08:09:01.718082 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:09:01.718105 kubelet[2569]: E0213 08:09:01.718104 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 08:09:01.756552 kernel: audit: type=1103 audit(1707811741.304:1414): pid=10767 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:01.304000 audit[10767]: CRED_ACQ pid=10767 uid=0 auid=500 ses=18 
subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:01.845858 kernel: audit: type=1100 audit(1707811741.312:1415): pid=10735 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? acct="root" exe="/usr/sbin/sshd" hostname=101.36.65.131 addr=101.36.65.131 terminal=ssh res=failed' Feb 13 08:09:01.312000 audit[10735]: USER_AUTH pid=10735 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? acct="root" exe="/usr/sbin/sshd" hostname=101.36.65.131 addr=101.36.65.131 terminal=ssh res=failed' Feb 13 08:09:01.934318 kernel: audit: type=1106 audit(1707811741.382:1416): pid=10765 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:01.382000 audit[10765]: USER_END pid=10765 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:01.382000 audit[10765]: CRED_DISP pid=10765 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:01.382000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-145.40.90.207:22-139.178.68.195:47068 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:09:02.706834 env[1458]: time="2024-02-13T08:09:02.706705468Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 08:09:02.733572 env[1458]: time="2024-02-13T08:09:02.733536048Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:09:02.733853 kubelet[2569]: E0213 08:09:02.733753 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 08:09:02.733853 kubelet[2569]: E0213 08:09:02.733780 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 08:09:02.733853 kubelet[2569]: E0213 08:09:02.733802 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:09:02.733853 kubelet[2569]: E0213 08:09:02.733820 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 08:09:03.957405 sshd[10735]: Failed password for root from 101.36.65.131 port 28423 ssh2 Feb 13 08:09:04.707038 env[1458]: time="2024-02-13T08:09:04.706938551Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 08:09:04.733271 env[1458]: time="2024-02-13T08:09:04.733237150Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:09:04.733428 kubelet[2569]: E0213 08:09:04.733416 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = 
failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 08:09:04.733589 kubelet[2569]: E0213 08:09:04.733446 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 08:09:04.733589 kubelet[2569]: E0213 08:09:04.733467 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:09:04.733589 kubelet[2569]: E0213 08:09:04.733483 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 08:09:04.911845 sshd[10735]: Received disconnect from 101.36.65.131 port 28423:11: Bye Bye [preauth] Feb 13 08:09:04.911845 sshd[10735]: Disconnected from authenticating user root 101.36.65.131 port 28423 [preauth] Feb 13 08:09:04.914413 systemd[1]: sshd@17-145.40.90.207:22-101.36.65.131:28423.service: Deactivated successfully. Feb 13 08:09:04.914000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-145.40.90.207:22-101.36.65.131:28423 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:09:06.391580 systemd[1]: Started sshd@19-145.40.90.207:22-139.178.68.195:36948.service. Feb 13 08:09:06.391000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-145.40.90.207:22-139.178.68.195:36948 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:09:06.418160 kernel: kauditd_printk_skb: 3 callbacks suppressed Feb 13 08:09:06.418216 kernel: audit: type=1130 audit(1707811746.391:1420): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-145.40.90.207:22-139.178.68.195:36948 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:09:06.535000 audit[10880]: USER_ACCT pid=10880 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:06.536635 sshd[10880]: Accepted publickey for core from 139.178.68.195 port 36948 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:09:06.537946 sshd[10880]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:09:06.540298 systemd-logind[1446]: New session 19 of user core. Feb 13 08:09:06.540868 systemd[1]: Started session-19.scope. Feb 13 08:09:06.618515 sshd[10880]: pam_unix(sshd:session): session closed for user core Feb 13 08:09:06.620045 systemd[1]: sshd@19-145.40.90.207:22-139.178.68.195:36948.service: Deactivated successfully. Feb 13 08:09:06.620470 systemd[1]: session-19.scope: Deactivated successfully. Feb 13 08:09:06.620876 systemd-logind[1446]: Session 19 logged out. Waiting for processes to exit. Feb 13 08:09:06.621436 systemd-logind[1446]: Removed session 19. Feb 13 08:09:06.537000 audit[10880]: CRED_ACQ pid=10880 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:06.719503 kernel: audit: type=1101 audit(1707811746.535:1421): pid=10880 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:06.719543 kernel: audit: type=1103 audit(1707811746.537:1422): pid=10880 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:06.719562 kernel: audit: type=1006 audit(1707811746.537:1423): pid=10880 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=19 res=1 Feb 13 08:09:06.778457 kernel: audit: type=1300 audit(1707811746.537:1423): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffce1dfd050 a2=3 a3=0 items=0 ppid=1 pid=10880 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=19 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:09:06.537000 audit[10880]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffce1dfd050 a2=3 a3=0 items=0 ppid=1 pid=10880 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=19 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:09:06.870934 kernel: audit: type=1327 audit(1707811746.537:1423): proctitle=737368643A20636F7265205B707269765D Feb 13 08:09:06.537000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:09:06.901563 kernel: audit: type=1105 audit(1707811746.542:1424): pid=10880 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 
08:09:06.542000 audit[10880]: USER_START pid=10880 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:06.996535 kernel: audit: type=1103 audit(1707811746.543:1425): pid=10882 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:06.543000 audit[10882]: CRED_ACQ pid=10882 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:07.086233 kernel: audit: type=1106 audit(1707811746.618:1426): pid=10880 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:06.618000 audit[10880]: USER_END pid=10880 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:07.182247 kernel: audit: type=1104 audit(1707811746.618:1427): pid=10880 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:06.618000 audit[10880]: CRED_DISP pid=10880 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:06.619000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-145.40.90.207:22-139.178.68.195:36948 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:09:11.628305 systemd[1]: Started sshd@20-145.40.90.207:22-139.178.68.195:36956.service. Feb 13 08:09:11.627000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-145.40.90.207:22-139.178.68.195:36956 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:09:11.655252 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:09:11.655317 kernel: audit: type=1130 audit(1707811751.627:1429): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-145.40.90.207:22-139.178.68.195:36956 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:09:11.773000 audit[10906]: USER_ACCT pid=10906 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:11.774909 sshd[10906]: Accepted publickey for core from 139.178.68.195 port 36956 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:09:11.776801 sshd[10906]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:09:11.781508 systemd-logind[1446]: New session 20 of user core. Feb 13 08:09:11.781962 systemd[1]: Started session-20.scope. Feb 13 08:09:11.861350 sshd[10906]: pam_unix(sshd:session): session closed for user core Feb 13 08:09:11.862740 systemd[1]: sshd@20-145.40.90.207:22-139.178.68.195:36956.service: Deactivated successfully. Feb 13 08:09:11.863164 systemd[1]: session-20.scope: Deactivated successfully. Feb 13 08:09:11.863533 systemd-logind[1446]: Session 20 logged out. Waiting for processes to exit. Feb 13 08:09:11.864084 systemd-logind[1446]: Removed session 20. Feb 13 08:09:11.775000 audit[10906]: CRED_ACQ pid=10906 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:11.957603 kernel: audit: type=1101 audit(1707811751.773:1430): pid=10906 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:11.957644 kernel: audit: type=1103 audit(1707811751.775:1431): pid=10906 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:11.957666 kernel: audit: type=1006 audit(1707811751.775:1432): pid=10906 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=20 res=1 Feb 13 08:09:12.016646 kernel: audit: type=1300 audit(1707811751.775:1432): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc582c7300 a2=3 a3=0 items=0 ppid=1 pid=10906 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=20 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:09:11.775000 audit[10906]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc582c7300 a2=3 a3=0 items=0 ppid=1 pid=10906 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=20 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:09:11.775000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:09:12.139887 kernel: audit: type=1327 audit(1707811751.775:1432): proctitle=737368643A20636F7265205B707269765D Feb 13 08:09:12.139919 kernel: audit: type=1105 audit(1707811751.783:1433): pid=10906 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 
08:09:11.783000 audit[10906]: USER_START pid=10906 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:12.234902 kernel: audit: type=1103 audit(1707811751.784:1434): pid=10908 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:11.784000 audit[10908]: CRED_ACQ pid=10908 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:12.324095 kernel: audit: type=1106 audit(1707811751.861:1435): pid=10906 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:11.861000 audit[10906]: USER_END pid=10906 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:12.419603 kernel: audit: type=1104 audit(1707811751.861:1436): pid=10906 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:11.861000 audit[10906]: CRED_DISP pid=10906 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:11.862000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-145.40.90.207:22-139.178.68.195:36956 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:09:13.706603 env[1458]: time="2024-02-13T08:09:13.706520441Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 08:09:13.753337 env[1458]: time="2024-02-13T08:09:13.753280608Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:09:13.753562 kubelet[2569]: E0213 08:09:13.753543 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 08:09:13.753895 kubelet[2569]: E0213 08:09:13.753583 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 08:09:13.753895 kubelet[2569]: E0213 08:09:13.753625 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:09:13.753895 kubelet[2569]: E0213 08:09:13.753668 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 08:09:15.706901 env[1458]: time="2024-02-13T08:09:15.706779962Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 08:09:15.761077 env[1458]: time="2024-02-13T08:09:15.760971500Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:09:15.761296 kubelet[2569]: E0213 08:09:15.761264 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox 
\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 08:09:15.761681 kubelet[2569]: E0213 08:09:15.761315 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 08:09:15.761681 kubelet[2569]: E0213 08:09:15.761369 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:09:15.761681 kubelet[2569]: E0213 08:09:15.761410 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 08:09:16.705087 env[1458]: time="2024-02-13T08:09:16.705058856Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 08:09:16.705087 env[1458]: time="2024-02-13T08:09:16.705058826Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 08:09:16.721544 env[1458]: time="2024-02-13T08:09:16.721479866Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:09:16.721544 env[1458]: time="2024-02-13T08:09:16.721479853Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:09:16.721808 kubelet[2569]: E0213 08:09:16.721666 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 08:09:16.721808 kubelet[2569]: E0213 08:09:16.721696 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 08:09:16.721808 kubelet[2569]: E0213 08:09:16.721720 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:09:16.721808 kubelet[2569]: E0213 08:09:16.721666 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 08:09:16.721933 kubelet[2569]: E0213 08:09:16.721740 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 08:09:16.721933 kubelet[2569]: E0213 08:09:16.721750 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 08:09:16.721933 kubelet[2569]: E0213 08:09:16.721770 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:09:16.721933 kubelet[2569]: E0213 08:09:16.721785 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 08:09:16.870789 systemd[1]: Started 
sshd@21-145.40.90.207:22-139.178.68.195:45798.service. Feb 13 08:09:16.870000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-145.40.90.207:22-139.178.68.195:45798 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:09:16.897043 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:09:16.897079 kernel: audit: type=1130 audit(1707811756.870:1438): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-145.40.90.207:22-139.178.68.195:45798 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:09:17.014000 audit[11046]: USER_ACCT pid=11046 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:17.014954 sshd[11046]: Accepted publickey for core from 139.178.68.195 port 45798 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:09:17.016883 sshd[11046]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:09:17.019508 systemd-logind[1446]: New session 21 of user core. Feb 13 08:09:17.020510 systemd[1]: Started session-21.scope. Feb 13 08:09:17.101638 sshd[11046]: pam_unix(sshd:session): session closed for user core Feb 13 08:09:17.103490 systemd[1]: sshd@21-145.40.90.207:22-139.178.68.195:45798.service: Deactivated successfully. Feb 13 08:09:17.104302 systemd[1]: session-21.scope: Deactivated successfully. Feb 13 08:09:17.104920 systemd-logind[1446]: Session 21 logged out. Waiting for processes to exit. Feb 13 08:09:17.105416 systemd-logind[1446]: Removed session 21. 
Feb 13 08:09:17.015000 audit[11046]: CRED_ACQ pid=11046 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:17.199222 kernel: audit: type=1101 audit(1707811757.014:1439): pid=11046 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:17.199260 kernel: audit: type=1103 audit(1707811757.015:1440): pid=11046 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:17.199277 kernel: audit: type=1006 audit(1707811757.015:1441): pid=11046 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=21 res=1 Feb 13 08:09:17.257881 kernel: audit: type=1300 audit(1707811757.015:1441): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff2df14fc0 a2=3 a3=0 items=0 ppid=1 pid=11046 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=21 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:09:17.015000 audit[11046]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff2df14fc0 a2=3 a3=0 items=0 ppid=1 pid=11046 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=21 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:09:17.015000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:09:17.380445 kernel: audit: type=1327 audit(1707811757.015:1441): proctitle=737368643A20636F7265205B707269765D Feb 13 08:09:17.380481 kernel: audit: type=1105 audit(1707811757.022:1442): pid=11046 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:17.022000 audit[11046]: USER_START pid=11046 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:17.474978 kernel: audit: type=1103 audit(1707811757.023:1443): pid=11048 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:17.023000 audit[11048]: CRED_ACQ pid=11048 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:17.100000 audit[11046]: USER_END pid=11046 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" 
exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:17.659803 kernel: audit: type=1106 audit(1707811757.100:1444): pid=11046 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:17.659878 kernel: audit: type=1104 audit(1707811757.100:1445): pid=11046 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:17.100000 audit[11046]: CRED_DISP pid=11046 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:17.100000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-145.40.90.207:22-139.178.68.195:45798 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:09:22.111083 systemd[1]: Started sshd@22-145.40.90.207:22-139.178.68.195:45804.service. Feb 13 08:09:22.110000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-145.40.90.207:22-139.178.68.195:45804 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:09:22.137864 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:09:22.137945 kernel: audit: type=1130 audit(1707811762.110:1447): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-145.40.90.207:22-139.178.68.195:45804 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:09:22.255000 audit[11072]: USER_ACCT pid=11072 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:22.256657 sshd[11072]: Accepted publickey for core from 139.178.68.195 port 45804 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:09:22.257943 sshd[11072]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:09:22.260750 systemd-logind[1446]: New session 22 of user core. Feb 13 08:09:22.261839 systemd[1]: Started session-22.scope. Feb 13 08:09:22.341823 sshd[11072]: pam_unix(sshd:session): session closed for user core Feb 13 08:09:22.343750 systemd[1]: sshd@22-145.40.90.207:22-139.178.68.195:45804.service: Deactivated successfully. Feb 13 08:09:22.344203 systemd[1]: session-22.scope: Deactivated successfully. Feb 13 08:09:22.344530 systemd-logind[1446]: Session 22 logged out. Waiting for processes to exit. Feb 13 08:09:22.345077 systemd-logind[1446]: Removed session 22. 
Feb 13 08:09:22.257000 audit[11072]: CRED_ACQ pid=11072 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:22.438536 kernel: audit: type=1101 audit(1707811762.255:1448): pid=11072 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:22.438575 kernel: audit: type=1103 audit(1707811762.257:1449): pid=11072 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:22.438591 kernel: audit: type=1006 audit(1707811762.257:1450): pid=11072 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=22 res=1 Feb 13 08:09:22.497167 kernel: audit: type=1300 audit(1707811762.257:1450): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd44dc9970 a2=3 a3=0 items=0 ppid=1 pid=11072 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=22 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:09:22.257000 audit[11072]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd44dc9970 a2=3 a3=0 items=0 ppid=1 pid=11072 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=22 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:09:22.589249 kernel: audit: type=1327 audit(1707811762.257:1450): proctitle=737368643A20636F7265205B707269765D Feb 13 08:09:22.257000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:09:22.619711 kernel: audit: type=1105 audit(1707811762.263:1451): pid=11072 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:22.263000 audit[11072]: USER_START pid=11072 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:22.714092 kernel: audit: type=1103 audit(1707811762.264:1452): pid=11074 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:22.264000 audit[11074]: CRED_ACQ pid=11074 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:22.342000 audit[11072]: USER_END pid=11072 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" 
exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:22.898719 kernel: audit: type=1106 audit(1707811762.342:1453): pid=11072 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:22.898793 kernel: audit: type=1104 audit(1707811762.342:1454): pid=11072 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:22.342000 audit[11072]: CRED_DISP pid=11072 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:22.343000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-145.40.90.207:22-139.178.68.195:45804 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:09:24.706184 env[1458]: time="2024-02-13T08:09:24.706087638Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 08:09:24.732088 env[1458]: time="2024-02-13T08:09:24.732023363Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:09:24.732274 kubelet[2569]: E0213 08:09:24.732263 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 08:09:24.732429 kubelet[2569]: E0213 08:09:24.732289 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 08:09:24.732429 kubelet[2569]: E0213 08:09:24.732312 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:09:24.732429 kubelet[2569]: E0213 08:09:24.732328 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed 
to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 08:09:26.707072 env[1458]: time="2024-02-13T08:09:26.706946956Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 08:09:26.724032 env[1458]: time="2024-02-13T08:09:26.723965607Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:09:26.724206 kubelet[2569]: E0213 08:09:26.724166 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 08:09:26.724206 kubelet[2569]: E0213 08:09:26.724192 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 08:09:26.724400 kubelet[2569]: E0213 08:09:26.724216 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:09:26.724400 kubelet[2569]: E0213 08:09:26.724236 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 08:09:27.351360 systemd[1]: Started sshd@23-145.40.90.207:22-139.178.68.195:57968.service. Feb 13 08:09:27.350000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-145.40.90.207:22-139.178.68.195:57968 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:09:27.378452 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:09:27.378491 kernel: audit: type=1130 audit(1707811767.350:1456): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-145.40.90.207:22-139.178.68.195:57968 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:09:27.495000 audit[11153]: USER_ACCT pid=11153 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:27.495955 sshd[11153]: Accepted publickey for core from 139.178.68.195 port 57968 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:09:27.497013 sshd[11153]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:09:27.499360 systemd-logind[1446]: New session 23 of user core. Feb 13 08:09:27.499889 systemd[1]: Started session-23.scope. Feb 13 08:09:27.579038 sshd[11153]: pam_unix(sshd:session): session closed for user core Feb 13 08:09:27.580442 systemd[1]: sshd@23-145.40.90.207:22-139.178.68.195:57968.service: Deactivated successfully. Feb 13 08:09:27.580887 systemd[1]: session-23.scope: Deactivated successfully. Feb 13 08:09:27.581272 systemd-logind[1446]: Session 23 logged out. Waiting for processes to exit. Feb 13 08:09:27.581677 systemd-logind[1446]: Removed session 23. Feb 13 08:09:27.496000 audit[11153]: CRED_ACQ pid=11153 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:27.678613 kernel: audit: type=1101 audit(1707811767.495:1457): pid=11153 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:27.678654 kernel: audit: type=1103 audit(1707811767.496:1458): pid=11153 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:27.678672 kernel: audit: type=1006 audit(1707811767.496:1459): pid=11153 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=23 res=1 Feb 13 08:09:27.737182 kernel: audit: type=1300 audit(1707811767.496:1459): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff814d5580 a2=3 a3=0 items=0 ppid=1 pid=11153 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=23 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:09:27.496000 audit[11153]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff814d5580 a2=3 a3=0 items=0 ppid=1 pid=11153 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=23 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:09:27.829093 kernel: audit: type=1327 audit(1707811767.496:1459): proctitle=737368643A20636F7265205B707269765D Feb 13 08:09:27.496000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 
08:09:27.859565 kernel: audit: type=1105 audit(1707811767.501:1460): pid=11153 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:27.501000 audit[11153]: USER_START pid=11153 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:27.953979 kernel: audit: type=1103 audit(1707811767.502:1461): pid=11155 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:27.502000 audit[11155]: CRED_ACQ pid=11155 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:28.043105 kernel: audit: type=1106 audit(1707811767.579:1462): pid=11153 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:27.579000 audit[11153]: USER_END pid=11153 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:27.579000 audit[11153]: CRED_DISP pid=11153 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:28.227913 kernel: audit: type=1104 audit(1707811767.579:1463): pid=11153 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:27.579000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-145.40.90.207:22-139.178.68.195:57968 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:09:29.705223 env[1458]: time="2024-02-13T08:09:29.705165489Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 08:09:29.717815 env[1458]: time="2024-02-13T08:09:29.717746180Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:09:29.717969 kubelet[2569]: E0213 08:09:29.717929 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 08:09:29.717969 kubelet[2569]: E0213 08:09:29.717957 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 08:09:29.718186 kubelet[2569]: E0213 08:09:29.717982 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:09:29.718186 kubelet[2569]: E0213 08:09:29.718003 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 08:09:30.706384 env[1458]: time="2024-02-13T08:09:30.706258732Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 08:09:30.759038 env[1458]: time="2024-02-13T08:09:30.758952215Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:09:30.759272 kubelet[2569]: E0213 08:09:30.759219 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox 
\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 08:09:30.759272 kubelet[2569]: E0213 08:09:30.759261 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 08:09:30.759651 kubelet[2569]: E0213 08:09:30.759307 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:09:30.759651 kubelet[2569]: E0213 08:09:30.759341 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 08:09:32.590609 systemd[1]: Started sshd@24-145.40.90.207:22-139.178.68.195:57982.service. Feb 13 08:09:32.590000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@24-145.40.90.207:22-139.178.68.195:57982 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:09:32.617964 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:09:32.618047 kernel: audit: type=1130 audit(1707811772.590:1465): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@24-145.40.90.207:22-139.178.68.195:57982 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:09:32.735000 audit[11237]: USER_ACCT pid=11237 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:32.736249 sshd[11237]: Accepted publickey for core from 139.178.68.195 port 57982 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:09:32.738382 sshd[11237]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:09:32.743830 systemd-logind[1446]: New session 24 of user core. Feb 13 08:09:32.745521 systemd[1]: Started session-24.scope. 
Feb 13 08:09:32.737000 audit[11237]: CRED_ACQ pid=11237 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:32.831534 sshd[11237]: pam_unix(sshd:session): session closed for user core Feb 13 08:09:32.833108 systemd[1]: sshd@24-145.40.90.207:22-139.178.68.195:57982.service: Deactivated successfully. Feb 13 08:09:32.833551 systemd[1]: session-24.scope: Deactivated successfully. Feb 13 08:09:32.834084 systemd-logind[1446]: Session 24 logged out. Waiting for processes to exit. Feb 13 08:09:32.834570 systemd-logind[1446]: Removed session 24. Feb 13 08:09:32.919196 kernel: audit: type=1101 audit(1707811772.735:1466): pid=11237 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:32.919254 kernel: audit: type=1103 audit(1707811772.737:1467): pid=11237 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:32.919273 kernel: audit: type=1006 audit(1707811772.737:1468): pid=11237 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=24 res=1 Feb 13 08:09:32.977866 kernel: audit: type=1300 audit(1707811772.737:1468): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffcaab1d910 a2=3 a3=0 items=0 ppid=1 pid=11237 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=24 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:09:32.737000 audit[11237]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffcaab1d910 a2=3 a3=0 items=0 ppid=1 pid=11237 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=24 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:09:33.069954 kernel: audit: type=1327 audit(1707811772.737:1468): proctitle=737368643A20636F7265205B707269765D Feb 13 08:09:32.737000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:09:33.100433 kernel: audit: type=1105 audit(1707811772.751:1469): pid=11237 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:32.751000 audit[11237]: USER_START pid=11237 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:33.194953 kernel: audit: type=1103 audit(1707811772.753:1470): pid=11239 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:32.753000 audit[11239]: CRED_ACQ pid=11239 uid=0 auid=500 ses=24 
subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:33.284309 kernel: audit: type=1106 audit(1707811772.831:1471): pid=11237 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:32.831000 audit[11237]: USER_END pid=11237 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:33.379899 kernel: audit: type=1104 audit(1707811772.831:1472): pid=11237 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:32.831000 audit[11237]: CRED_DISP pid=11237 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:32.832000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@24-145.40.90.207:22-139.178.68.195:57982 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:09:37.706931 env[1458]: time="2024-02-13T08:09:37.706827370Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 08:09:37.736375 env[1458]: time="2024-02-13T08:09:37.736278458Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:09:37.736521 kubelet[2569]: E0213 08:09:37.736511 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 08:09:37.736742 kubelet[2569]: E0213 08:09:37.736536 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 08:09:37.736742 kubelet[2569]: E0213 08:09:37.736558 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": 
plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:09:37.736742 kubelet[2569]: E0213 08:09:37.736576 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 08:09:37.842219 systemd[1]: Started sshd@25-145.40.90.207:22-139.178.68.195:33812.service. Feb 13 08:09:37.841000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@25-145.40.90.207:22-139.178.68.195:33812 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:09:37.869580 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:09:37.869613 kernel: audit: type=1130 audit(1707811777.841:1474): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@25-145.40.90.207:22-139.178.68.195:33812 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:09:37.986000 audit[11296]: USER_ACCT pid=11296 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:37.987506 sshd[11296]: Accepted publickey for core from 139.178.68.195 port 33812 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:09:37.988948 sshd[11296]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:09:37.991322 systemd-logind[1446]: New session 25 of user core. Feb 13 08:09:37.991833 systemd[1]: Started session-25.scope. Feb 13 08:09:38.072252 sshd[11296]: pam_unix(sshd:session): session closed for user core Feb 13 08:09:38.073720 systemd[1]: sshd@25-145.40.90.207:22-139.178.68.195:33812.service: Deactivated successfully. Feb 13 08:09:38.074140 systemd[1]: session-25.scope: Deactivated successfully. Feb 13 08:09:38.074479 systemd-logind[1446]: Session 25 logged out. Waiting for processes to exit. Feb 13 08:09:38.075139 systemd-logind[1446]: Removed session 25. 
Feb 13 08:09:37.988000 audit[11296]: CRED_ACQ pid=11296 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:38.169461 kernel: audit: type=1101 audit(1707811777.986:1475): pid=11296 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:38.169507 kernel: audit: type=1103 audit(1707811777.988:1476): pid=11296 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:38.169526 kernel: audit: type=1006 audit(1707811777.988:1477): pid=11296 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=25 res=1 Feb 13 08:09:38.228111 kernel: audit: type=1300 audit(1707811777.988:1477): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffee56f13f0 a2=3 a3=0 items=0 ppid=1 pid=11296 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=25 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:09:37.988000 audit[11296]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffee56f13f0 a2=3 a3=0 items=0 ppid=1 pid=11296 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=25 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:09:38.320172 kernel: audit: type=1327 audit(1707811777.988:1477): proctitle=737368643A20636F7265205B707269765D Feb 13 08:09:37.988000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:09:38.350700 kernel: audit: type=1105 audit(1707811777.993:1478): pid=11296 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:37.993000 audit[11296]: USER_START pid=11296 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:38.445267 kernel: audit: type=1103 audit(1707811777.993:1479): pid=11298 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:37.993000 audit[11298]: CRED_ACQ pid=11298 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:38.534472 kernel: audit: type=1106 audit(1707811778.072:1480): pid=11296 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close 
grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:38.072000 audit[11296]: USER_END pid=11296 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:38.629997 kernel: audit: type=1104 audit(1707811778.072:1481): pid=11296 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:38.072000 audit[11296]: CRED_DISP pid=11296 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:38.073000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@25-145.40.90.207:22-139.178.68.195:33812 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:09:41.706948 env[1458]: time="2024-02-13T08:09:41.706852095Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 08:09:41.708160 env[1458]: time="2024-02-13T08:09:41.707125752Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 08:09:41.722853 env[1458]: time="2024-02-13T08:09:41.722788510Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:09:41.722853 env[1458]: time="2024-02-13T08:09:41.722801986Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:09:41.723041 kubelet[2569]: E0213 08:09:41.722994 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 08:09:41.723041 kubelet[2569]: E0213 08:09:41.723005 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" 
failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 08:09:41.723041 kubelet[2569]: E0213 08:09:41.723022 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 08:09:41.723041 kubelet[2569]: E0213 08:09:41.723022 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 08:09:41.723041 kubelet[2569]: E0213 08:09:41.723045 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:09:41.723342 kubelet[2569]: E0213 08:09:41.723046 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:09:41.723342 kubelet[2569]: E0213 08:09:41.723068 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 08:09:41.723342 kubelet[2569]: E0213 08:09:41.723068 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 08:09:43.081621 systemd[1]: Started sshd@26-145.40.90.207:22-139.178.68.195:33814.service. Feb 13 08:09:43.081000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@26-145.40.90.207:22-139.178.68.195:33814 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:09:43.108136 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:09:43.108191 kernel: audit: type=1130 audit(1707811783.081:1483): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@26-145.40.90.207:22-139.178.68.195:33814 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:09:43.223000 audit[11381]: USER_ACCT pid=11381 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:43.224736 sshd[11381]: Accepted publickey for core from 139.178.68.195 port 33814 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:09:43.227957 sshd[11381]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:09:43.230184 systemd-logind[1446]: New session 26 of user core. Feb 13 08:09:43.230867 systemd[1]: Started session-26.scope. Feb 13 08:09:43.311057 sshd[11381]: pam_unix(sshd:session): session closed for user core Feb 13 08:09:43.312411 systemd[1]: sshd@26-145.40.90.207:22-139.178.68.195:33814.service: Deactivated successfully. Feb 13 08:09:43.312894 systemd[1]: session-26.scope: Deactivated successfully. Feb 13 08:09:43.313252 systemd-logind[1446]: Session 26 logged out. Waiting for processes to exit. Feb 13 08:09:43.313635 systemd-logind[1446]: Removed session 26. Feb 13 08:09:43.227000 audit[11381]: CRED_ACQ pid=11381 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:43.408897 kernel: audit: type=1101 audit(1707811783.223:1484): pid=11381 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:43.408938 kernel: audit: type=1103 audit(1707811783.227:1485): pid=11381 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:43.408958 kernel: audit: type=1006 audit(1707811783.227:1486): pid=11381 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=26 res=1 Feb 13 08:09:43.467546 kernel: audit: type=1300 audit(1707811783.227:1486): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff82cc59c0 a2=3 a3=0 items=0 ppid=1 pid=11381 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=26 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:09:43.227000 audit[11381]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff82cc59c0 a2=3 a3=0 items=0 ppid=1 pid=11381 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=26 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:09:43.559717 kernel: audit: type=1327 audit(1707811783.227:1486): proctitle=737368643A20636F7265205B707269765D Feb 13 08:09:43.227000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 
08:09:43.590191 kernel: audit: type=1105 audit(1707811783.232:1487): pid=11381 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:43.232000 audit[11381]: USER_START pid=11381 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:43.684815 kernel: audit: type=1103 audit(1707811783.233:1488): pid=11383 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:43.233000 audit[11383]: CRED_ACQ pid=11383 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:43.705411 env[1458]: time="2024-02-13T08:09:43.705346447Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 08:09:43.717186 env[1458]: time="2024-02-13T08:09:43.717148182Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:09:43.717367 kubelet[2569]: E0213 08:09:43.717355 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 08:09:43.717537 kubelet[2569]: E0213 08:09:43.717382 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 08:09:43.717537 kubelet[2569]: E0213 08:09:43.717407 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:09:43.717537 kubelet[2569]: E0213 08:09:43.717424 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for 
sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 08:09:43.774091 kernel: audit: type=1106 audit(1707811783.311:1489): pid=11381 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:43.311000 audit[11381]: USER_END pid=11381 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:43.311000 audit[11381]: CRED_DISP pid=11381 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:43.959003 kernel: audit: type=1104 audit(1707811783.311:1490): pid=11381 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:43.311000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@26-145.40.90.207:22-139.178.68.195:33814 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:09:46.120000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:09:46.120000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c001dae810 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:09:46.120000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:09:46.120000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:09:46.120000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c0029ecec0 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:09:46.120000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:09:46.190000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:09:46.190000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5c a1=c005dc2480 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:09:46.190000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:09:46.190000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/apiserver.crt" dev="sdb9" ino=524817 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:09:46.190000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=60 a1=c0086937d0 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 
08:09:46.190000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:09:46.190000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:09:46.190000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=60 a1=c009168f20 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:09:46.190000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:09:46.933000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-client.crt" dev="sdb9" ino=524823 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:09:46.933000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5c a1=c009d7a0c0 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:09:46.933000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:09:46.933000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:09:46.933000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=60 a1=c00e2ca5d0 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:09:46.933000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:09:46.934000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:09:46.934000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5c a1=c0099fd560 a2=fc6 a3=0 
items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:09:46.934000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:09:48.322352 systemd[1]: Started sshd@27-145.40.90.207:22-139.178.68.195:37854.service. Feb 13 08:09:48.322000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@27-145.40.90.207:22-139.178.68.195:37854 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:09:48.364339 kernel: kauditd_printk_skb: 25 callbacks suppressed Feb 13 08:09:48.364446 kernel: audit: type=1130 audit(1707811788.322:1500): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@27-145.40.90.207:22-139.178.68.195:37854 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:09:48.479000 audit[11438]: USER_ACCT pid=11438 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:48.479980 sshd[11438]: Accepted publickey for core from 139.178.68.195 port 37854 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:09:48.481535 sshd[11438]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:09:48.484011 systemd-logind[1446]: New session 27 of user core. Feb 13 08:09:48.484612 systemd[1]: Started session-27.scope. Feb 13 08:09:48.563263 sshd[11438]: pam_unix(sshd:session): session closed for user core Feb 13 08:09:48.564753 systemd[1]: sshd@27-145.40.90.207:22-139.178.68.195:37854.service: Deactivated successfully. Feb 13 08:09:48.565192 systemd[1]: session-27.scope: Deactivated successfully. Feb 13 08:09:48.565509 systemd-logind[1446]: Session 27 logged out. Waiting for processes to exit. Feb 13 08:09:48.566139 systemd-logind[1446]: Removed session 27. 
Feb 13 08:09:48.480000 audit[11438]: CRED_ACQ pid=11438 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:48.662024 kernel: audit: type=1101 audit(1707811788.479:1501): pid=11438 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:48.662117 kernel: audit: type=1103 audit(1707811788.480:1502): pid=11438 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:48.662137 kernel: audit: type=1006 audit(1707811788.480:1503): pid=11438 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=27 res=1 Feb 13 08:09:48.705639 env[1458]: time="2024-02-13T08:09:48.705573196Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 08:09:48.480000 audit[11438]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffee19cf4c0 a2=3 a3=0 items=0 ppid=1 pid=11438 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=27 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:09:48.721954 env[1458]: time="2024-02-13T08:09:48.721880122Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:09:48.722101 kubelet[2569]: E0213 08:09:48.722087 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 08:09:48.722289 kubelet[2569]: E0213 08:09:48.722118 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 08:09:48.722289 kubelet[2569]: E0213 08:09:48.722153 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:09:48.722289 kubelet[2569]: E0213 08:09:48.722179 2569 pod_workers.go:1294] "Error syncing pod, skipping" 
err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 08:09:48.812813 kernel: audit: type=1300 audit(1707811788.480:1503): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffee19cf4c0 a2=3 a3=0 items=0 ppid=1 pid=11438 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=27 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:09:48.812849 kernel: audit: type=1327 audit(1707811788.480:1503): proctitle=737368643A20636F7265205B707269765D Feb 13 08:09:48.480000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:09:48.843315 kernel: audit: type=1105 audit(1707811788.486:1504): pid=11438 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:48.486000 audit[11438]: USER_START pid=11438 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:48.937826 kernel: audit: type=1103 audit(1707811788.486:1505): pid=11440 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:48.486000 audit[11440]: CRED_ACQ pid=11440 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:49.027064 kernel: audit: type=1106 audit(1707811788.563:1506): pid=11438 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:48.563000 audit[11438]: USER_END pid=11438 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:49.122592 kernel: audit: type=1104 audit(1707811788.563:1507): pid=11438 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:48.563000 audit[11438]: CRED_DISP pid=11438 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred 
grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:48.564000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@27-145.40.90.207:22-139.178.68.195:37854 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:09:50.870000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:09:50.870000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c001a3d180 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:09:50.870000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:09:50.876000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:09:50.876000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c002924620 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:09:50.876000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:09:50.876000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:09:50.876000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c001a3d1a0 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:09:50.876000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:09:50.880000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 
08:09:50.880000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c000b226e0 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:09:50.880000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:09:52.706447 env[1458]: time="2024-02-13T08:09:52.706356657Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 08:09:52.722149 env[1458]: time="2024-02-13T08:09:52.722088560Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:09:52.722329 kubelet[2569]: E0213 08:09:52.722282 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 08:09:52.722329 kubelet[2569]: E0213 08:09:52.722306 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 08:09:52.722329 kubelet[2569]: E0213 08:09:52.722329 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:09:52.722549 kubelet[2569]: E0213 08:09:52.722346 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 08:09:53.573254 systemd[1]: Started sshd@28-145.40.90.207:22-139.178.68.195:37856.service. 
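The StopPodSandbox failures recurring throughout this log all share one root cause: on every delete, the Calico CNI plugin stats /var/lib/calico/nodename, a file the calico/node container writes when it starts, and the file is absent here, which is consistent with calico/node not running on this host (exactly the hint embedded in the plugin's own error text). A minimal Python sketch of that same check, assuming nothing beyond the path quoted in the error:

#!/usr/bin/env python3
# Minimal sketch of the check the Calico CNI plugin is failing above:
# calico/node writes its node name to /var/lib/calico/nodename at startup,
# and the CNI binary stats that file on every ADD/DEL. If calico/node is
# not running (or /var/lib/calico/ is not mounted), every sandbox teardown
# fails exactly as shown in this log.
import sys

NODENAME_FILE = "/var/lib/calico/nodename"

def check_calico_nodename(path: str = NODENAME_FILE) -> int:
    try:
        with open(path) as f:
            print(f"calico nodename present: {f.read().strip()}")
        return 0
    except FileNotFoundError:
        # Same condition containerd reports as
        # "stat /var/lib/calico/nodename: no such file or directory".
        print(f"{path} missing: check that the calico/node container "
              "is running and has mounted /var/lib/calico/", file=sys.stderr)
        return 1

if __name__ == "__main__":
    sys.exit(check_calico_nodename())

On a healthy node the file simply contains the Calico node name, so the kubelet's retry loop (visible below as the same four sandboxes failing every few seconds) would stop as soon as calico/node comes back up and rewrites it.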
Feb 13 08:09:53.572000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@28-145.40.90.207:22-139.178.68.195:37856 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:09:53.600362 kernel: kauditd_printk_skb: 13 callbacks suppressed Feb 13 08:09:53.600431 kernel: audit: type=1130 audit(1707811793.572:1513): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@28-145.40.90.207:22-139.178.68.195:37856 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:09:53.716000 audit[11521]: USER_ACCT pid=11521 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:53.717426 sshd[11521]: Accepted publickey for core from 139.178.68.195 port 37856 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:09:53.718961 sshd[11521]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:09:53.721307 systemd-logind[1446]: New session 28 of user core. Feb 13 08:09:53.721841 systemd[1]: Started session-28.scope. Feb 13 08:09:53.802941 sshd[11521]: pam_unix(sshd:session): session closed for user core Feb 13 08:09:53.804447 systemd[1]: sshd@28-145.40.90.207:22-139.178.68.195:37856.service: Deactivated successfully. Feb 13 08:09:53.805022 systemd[1]: session-28.scope: Deactivated successfully. Feb 13 08:09:53.805440 systemd-logind[1446]: Session 28 logged out. Waiting for processes to exit. Feb 13 08:09:53.805977 systemd-logind[1446]: Removed session 28. 
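The AVC records interleaved above (pid 2387, comm "kube-controller") are SELinux denials of inotify watches: syscall 254 on x86_64 is inotify_add_watch, and exit=-13 is -EACCES, so kube-controller-manager cannot watch /etc/kubernetes/pki/ca.crt because its container domain (svirt_lxc_net_t) is not allowed the watch permission on files labeled etc_t, with permissive=0. A small sketch, a hypothetical helper assuming only the audit line format shown in this log, that condenses such records into per-denial counts:

#!/usr/bin/env python3
# Sketch: summarize SELinux AVC denials like the ones above by piping
# journal/audit text through stdin. Groups by (comm, permission, path,
# source context, target context) so repeated identical denials collapse
# into a single counted line.
import re
import sys
from collections import Counter

AVC_RE = re.compile(
    r'avc:\s+denied\s+\{ (?P<perm>[^}]+)\}.*?'
    r'comm="(?P<comm>[^"]+)".*?path="(?P<path>[^"]+)".*?'
    r'scontext=(?P<scontext>\S+)\s+tcontext=(?P<tcontext>\S+)'
)

def summarize(lines):
    counts = Counter()
    for line in lines:
        m = AVC_RE.search(line)
        if m:
            counts[(m["comm"], m["perm"].strip(), m["path"],
                    m["scontext"], m["tcontext"])] += 1
    for (comm, perm, path, sctx, tctx), n in counts.most_common():
        print(f"{n:4d}x {comm}: denied '{perm}' on {path} ({sctx} -> {tctx})")

if __name__ == "__main__":
    summarize(sys.stdin)

Run against this log it would show the ca.crt watch denial repeating identically, i.e. a policy gap rather than a transient failure; the controller keeps retrying the watch and keeps being refused.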
Feb 13 08:09:53.718000 audit[11521]: CRED_ACQ pid=11521 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:53.902170 kernel: audit: type=1101 audit(1707811793.716:1514): pid=11521 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:53.902208 kernel: audit: type=1103 audit(1707811793.718:1515): pid=11521 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:53.902227 kernel: audit: type=1006 audit(1707811793.718:1516): pid=11521 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=28 res=1 Feb 13 08:09:53.960796 kernel: audit: type=1300 audit(1707811793.718:1516): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd88e45a70 a2=3 a3=0 items=0 ppid=1 pid=11521 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=28 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:09:53.718000 audit[11521]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd88e45a70 a2=3 a3=0 items=0 ppid=1 pid=11521 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=28 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:09:53.718000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:09:54.083220 kernel: audit: type=1327 audit(1707811793.718:1516): proctitle=737368643A20636F7265205B707269765D Feb 13 08:09:54.083250 kernel: audit: type=1105 audit(1707811793.723:1517): pid=11521 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:53.723000 audit[11521]: USER_START pid=11521 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:54.177711 kernel: audit: type=1103 audit(1707811793.723:1518): pid=11523 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:53.723000 audit[11523]: CRED_ACQ pid=11523 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:54.266844 kernel: audit: type=1106 audit(1707811793.802:1519): pid=11521 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close 
grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:53.802000 audit[11521]: USER_END pid=11521 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:54.362316 kernel: audit: type=1104 audit(1707811793.803:1520): pid=11521 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:53.803000 audit[11521]: CRED_DISP pid=11521 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:53.803000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@28-145.40.90.207:22-139.178.68.195:37856 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:09:54.706042 env[1458]: time="2024-02-13T08:09:54.705960769Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 08:09:54.732781 env[1458]: time="2024-02-13T08:09:54.732746715Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:09:54.732998 kubelet[2569]: E0213 08:09:54.732987 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 08:09:54.733168 kubelet[2569]: E0213 08:09:54.733013 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 08:09:54.733168 kubelet[2569]: E0213 08:09:54.733035 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:09:54.733168 kubelet[2569]: E0213 08:09:54.733054 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for 
\"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 08:09:55.706864 env[1458]: time="2024-02-13T08:09:55.706782841Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 08:09:55.723512 env[1458]: time="2024-02-13T08:09:55.723444595Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:09:55.723697 kubelet[2569]: E0213 08:09:55.723629 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 08:09:55.723697 kubelet[2569]: E0213 08:09:55.723667 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 08:09:55.723697 kubelet[2569]: E0213 08:09:55.723690 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:09:55.723822 kubelet[2569]: E0213 08:09:55.723708 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 08:09:58.811815 systemd[1]: Started sshd@29-145.40.90.207:22-139.178.68.195:47718.service. Feb 13 08:09:58.811000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@29-145.40.90.207:22-139.178.68.195:47718 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:09:58.838825 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:09:58.838911 kernel: audit: type=1130 audit(1707811798.811:1522): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@29-145.40.90.207:22-139.178.68.195:47718 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:09:58.954000 audit[11603]: USER_ACCT pid=11603 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:58.954912 sshd[11603]: Accepted publickey for core from 139.178.68.195 port 47718 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:09:58.956917 sshd[11603]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:09:58.959218 systemd-logind[1446]: New session 29 of user core. Feb 13 08:09:58.959818 systemd[1]: Started session-29.scope. Feb 13 08:09:59.040040 sshd[11603]: pam_unix(sshd:session): session closed for user core Feb 13 08:09:59.041369 systemd[1]: sshd@29-145.40.90.207:22-139.178.68.195:47718.service: Deactivated successfully. Feb 13 08:09:59.041873 systemd[1]: session-29.scope: Deactivated successfully. Feb 13 08:09:59.042319 systemd-logind[1446]: Session 29 logged out. Waiting for processes to exit. Feb 13 08:09:59.042841 systemd-logind[1446]: Removed session 29. Feb 13 08:09:58.956000 audit[11603]: CRED_ACQ pid=11603 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:59.139345 kernel: audit: type=1101 audit(1707811798.954:1523): pid=11603 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:59.139390 kernel: audit: type=1103 audit(1707811798.956:1524): pid=11603 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:59.139406 kernel: audit: type=1006 audit(1707811798.956:1525): pid=11603 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=29 res=1 Feb 13 08:09:59.197983 kernel: audit: type=1300 audit(1707811798.956:1525): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff61880e80 a2=3 a3=0 items=0 ppid=1 pid=11603 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=29 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:09:58.956000 audit[11603]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff61880e80 a2=3 a3=0 items=0 ppid=1 pid=11603 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=29 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:09:59.290003 kernel: audit: type=1327 audit(1707811798.956:1525): proctitle=737368643A20636F7265205B707269765D Feb 13 08:09:58.956000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 
08:09:59.320468 kernel: audit: type=1105 audit(1707811798.961:1526): pid=11603 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:58.961000 audit[11603]: USER_START pid=11603 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:59.415025 kernel: audit: type=1103 audit(1707811798.961:1527): pid=11605 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:58.961000 audit[11605]: CRED_ACQ pid=11605 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:59.504224 kernel: audit: type=1106 audit(1707811799.039:1528): pid=11603 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:59.039000 audit[11603]: USER_END pid=11603 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:59.040000 audit[11603]: CRED_DISP pid=11603 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:59.689001 kernel: audit: type=1104 audit(1707811799.040:1529): pid=11603 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:09:59.040000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@29-145.40.90.207:22-139.178.68.195:47718 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:10:03.707055 env[1458]: time="2024-02-13T08:10:03.706925334Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 08:10:03.733158 env[1458]: time="2024-02-13T08:10:03.733102564Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:10:03.733251 kubelet[2569]: E0213 08:10:03.733229 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 08:10:03.733251 kubelet[2569]: E0213 08:10:03.733251 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 08:10:03.733443 kubelet[2569]: E0213 08:10:03.733275 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:10:03.733443 kubelet[2569]: E0213 08:10:03.733294 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 08:10:04.048789 systemd[1]: Started sshd@30-145.40.90.207:22-139.178.68.195:47724.service. Feb 13 08:10:04.048000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@30-145.40.90.207:22-139.178.68.195:47724 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:10:04.075681 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:10:04.075786 kernel: audit: type=1130 audit(1707811804.048:1531): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@30-145.40.90.207:22-139.178.68.195:47724 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:10:04.193000 audit[11657]: USER_ACCT pid=11657 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:04.194220 sshd[11657]: Accepted publickey for core from 139.178.68.195 port 47724 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:10:04.198931 sshd[11657]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:10:04.208388 systemd-logind[1446]: New session 30 of user core. Feb 13 08:10:04.209036 systemd[1]: Started session-30.scope. Feb 13 08:10:04.197000 audit[11657]: CRED_ACQ pid=11657 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:04.289489 sshd[11657]: pam_unix(sshd:session): session closed for user core Feb 13 08:10:04.291044 systemd[1]: sshd@30-145.40.90.207:22-139.178.68.195:47724.service: Deactivated successfully. Feb 13 08:10:04.291477 systemd[1]: session-30.scope: Deactivated successfully. Feb 13 08:10:04.291844 systemd-logind[1446]: Session 30 logged out. Waiting for processes to exit. Feb 13 08:10:04.292328 systemd-logind[1446]: Removed session 30. Feb 13 08:10:04.376562 kernel: audit: type=1101 audit(1707811804.193:1532): pid=11657 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:04.376613 kernel: audit: type=1103 audit(1707811804.197:1533): pid=11657 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:04.376638 kernel: audit: type=1006 audit(1707811804.197:1534): pid=11657 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=30 res=1 Feb 13 08:10:04.435120 kernel: audit: type=1300 audit(1707811804.197:1534): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe58b69f40 a2=3 a3=0 items=0 ppid=1 pid=11657 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=30 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:10:04.197000 audit[11657]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe58b69f40 a2=3 a3=0 items=0 ppid=1 pid=11657 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=30 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:10:04.527088 kernel: audit: type=1327 audit(1707811804.197:1534): proctitle=737368643A20636F7265205B707269765D Feb 13 08:10:04.197000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:10:04.557550 kernel: audit: type=1105 audit(1707811804.210:1535): pid=11657 uid=0 auid=500 ses=30 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 
08:10:04.210000 audit[11657]: USER_START pid=11657 uid=0 auid=500 ses=30 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:04.651991 kernel: audit: type=1103 audit(1707811804.211:1536): pid=11659 uid=0 auid=500 ses=30 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:04.211000 audit[11659]: CRED_ACQ pid=11659 uid=0 auid=500 ses=30 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:04.741240 kernel: audit: type=1106 audit(1707811804.289:1537): pid=11657 uid=0 auid=500 ses=30 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:04.289000 audit[11657]: USER_END pid=11657 uid=0 auid=500 ses=30 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:04.836702 kernel: audit: type=1104 audit(1707811804.289:1538): pid=11657 uid=0 auid=500 ses=30 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:04.289000 audit[11657]: CRED_DISP pid=11657 uid=0 auid=500 ses=30 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:04.290000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@30-145.40.90.207:22-139.178.68.195:47724 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:10:05.707129 env[1458]: time="2024-02-13T08:10:05.707018169Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 08:10:05.758374 env[1458]: time="2024-02-13T08:10:05.758281878Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:10:05.758564 kubelet[2569]: E0213 08:10:05.758537 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 08:10:05.758920 kubelet[2569]: E0213 08:10:05.758589 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 08:10:05.758920 kubelet[2569]: E0213 08:10:05.758669 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:10:05.758920 kubelet[2569]: E0213 08:10:05.758706 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 08:10:06.706080 env[1458]: time="2024-02-13T08:10:06.705986432Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 08:10:06.758759 env[1458]: time="2024-02-13T08:10:06.758688300Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:10:06.759192 kubelet[2569]: E0213 08:10:06.758946 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox 
\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 08:10:06.759192 kubelet[2569]: E0213 08:10:06.758993 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 08:10:06.759192 kubelet[2569]: E0213 08:10:06.759045 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:10:06.759192 kubelet[2569]: E0213 08:10:06.759084 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 08:10:08.706441 env[1458]: time="2024-02-13T08:10:08.706350977Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 08:10:08.735555 env[1458]: time="2024-02-13T08:10:08.735496636Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:10:08.735769 kubelet[2569]: E0213 08:10:08.735731 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 08:10:08.735769 kubelet[2569]: E0213 08:10:08.735755 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 08:10:08.735953 kubelet[2569]: E0213 08:10:08.735776 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin 
type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:10:08.735953 kubelet[2569]: E0213 08:10:08.735795 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 08:10:09.298887 systemd[1]: Started sshd@31-145.40.90.207:22-139.178.68.195:51270.service. Feb 13 08:10:09.298000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@31-145.40.90.207:22-139.178.68.195:51270 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:10:09.325867 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:10:09.325949 kernel: audit: type=1130 audit(1707811809.298:1540): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@31-145.40.90.207:22-139.178.68.195:51270 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:10:09.442000 audit[11773]: USER_ACCT pid=11773 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:09.442948 sshd[11773]: Accepted publickey for core from 139.178.68.195 port 51270 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:10:09.444911 sshd[11773]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:10:09.447284 systemd-logind[1446]: New session 31 of user core. Feb 13 08:10:09.447832 systemd[1]: Started session-31.scope. Feb 13 08:10:09.528364 sshd[11773]: pam_unix(sshd:session): session closed for user core Feb 13 08:10:09.529876 systemd[1]: sshd@31-145.40.90.207:22-139.178.68.195:51270.service: Deactivated successfully. Feb 13 08:10:09.530327 systemd[1]: session-31.scope: Deactivated successfully. Feb 13 08:10:09.530620 systemd-logind[1446]: Session 31 logged out. Waiting for processes to exit. Feb 13 08:10:09.531368 systemd-logind[1446]: Removed session 31. 
Feb 13 08:10:09.444000 audit[11773]: CRED_ACQ pid=11773 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:09.626782 kernel: audit: type=1101 audit(1707811809.442:1541): pid=11773 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:09.626823 kernel: audit: type=1103 audit(1707811809.444:1542): pid=11773 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:09.626843 kernel: audit: type=1006 audit(1707811809.444:1543): pid=11773 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=31 res=1 Feb 13 08:10:09.685361 kernel: audit: type=1300 audit(1707811809.444:1543): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd300e8490 a2=3 a3=0 items=0 ppid=1 pid=11773 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=31 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:10:09.444000 audit[11773]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd300e8490 a2=3 a3=0 items=0 ppid=1 pid=11773 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=31 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:10:09.777269 kernel: audit: type=1327 audit(1707811809.444:1543): proctitle=737368643A20636F7265205B707269765D Feb 13 08:10:09.444000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:10:09.807695 kernel: audit: type=1105 audit(1707811809.449:1544): pid=11773 uid=0 auid=500 ses=31 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:09.449000 audit[11773]: USER_START pid=11773 uid=0 auid=500 ses=31 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:09.902208 kernel: audit: type=1103 audit(1707811809.450:1545): pid=11775 uid=0 auid=500 ses=31 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:09.450000 audit[11775]: CRED_ACQ pid=11775 uid=0 auid=500 ses=31 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:09.991370 kernel: audit: type=1106 audit(1707811809.528:1546): pid=11773 uid=0 auid=500 ses=31 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close 
grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:09.528000 audit[11773]: USER_END pid=11773 uid=0 auid=500 ses=31 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:09.528000 audit[11773]: CRED_DISP pid=11773 uid=0 auid=500 ses=31 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:10.176091 kernel: audit: type=1104 audit(1707811809.528:1547): pid=11773 uid=0 auid=500 ses=31 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:09.529000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@31-145.40.90.207:22-139.178.68.195:51270 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:10:14.531884 systemd[1]: Started sshd@32-145.40.90.207:22-139.178.68.195:51274.service. Feb 13 08:10:14.530000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@32-145.40.90.207:22-139.178.68.195:51274 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:10:14.558846 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:10:14.558922 kernel: audit: type=1130 audit(1707811814.530:1549): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@32-145.40.90.207:22-139.178.68.195:51274 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:10:14.673000 audit[11798]: USER_ACCT pid=11798 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:14.675570 sshd[11798]: Accepted publickey for core from 139.178.68.195 port 51274 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:10:14.678026 sshd[11798]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:10:14.685372 systemd-logind[1446]: New session 32 of user core. Feb 13 08:10:14.687157 systemd[1]: Started session-32.scope. Feb 13 08:10:14.675000 audit[11798]: CRED_ACQ pid=11798 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:14.786001 sshd[11798]: pam_unix(sshd:session): session closed for user core Feb 13 08:10:14.787394 systemd[1]: sshd@32-145.40.90.207:22-139.178.68.195:51274.service: Deactivated successfully. Feb 13 08:10:14.787823 systemd[1]: session-32.scope: Deactivated successfully. Feb 13 08:10:14.788249 systemd-logind[1446]: Session 32 logged out. Waiting for processes to exit. 
Feb 13 08:10:14.788777 systemd-logind[1446]: Removed session 32. Feb 13 08:10:14.857095 kernel: audit: type=1101 audit(1707811814.673:1550): pid=11798 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:14.857131 kernel: audit: type=1103 audit(1707811814.675:1551): pid=11798 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:14.857155 kernel: audit: type=1006 audit(1707811814.675:1552): pid=11798 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=32 res=1 Feb 13 08:10:14.915716 kernel: audit: type=1300 audit(1707811814.675:1552): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffea97fcf20 a2=3 a3=0 items=0 ppid=1 pid=11798 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=32 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:10:14.675000 audit[11798]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffea97fcf20 a2=3 a3=0 items=0 ppid=1 pid=11798 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=32 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:10:15.007658 kernel: audit: type=1327 audit(1707811814.675:1552): proctitle=737368643A20636F7265205B707269765D Feb 13 08:10:14.675000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:10:15.038231 kernel: audit: type=1105 audit(1707811814.694:1553): pid=11798 uid=0 auid=500 ses=32 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:14.694000 audit[11798]: USER_START pid=11798 uid=0 auid=500 ses=32 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:15.132716 kernel: audit: type=1103 audit(1707811814.696:1554): pid=11800 uid=0 auid=500 ses=32 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:14.696000 audit[11800]: CRED_ACQ pid=11800 uid=0 auid=500 ses=32 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:15.221904 kernel: audit: type=1106 audit(1707811814.785:1555): pid=11798 uid=0 auid=500 ses=32 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:14.785000 audit[11798]: USER_END pid=11798 uid=0 auid=500 ses=32 subj=system_u:system_r:kernel_t:s0 
msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:15.317402 kernel: audit: type=1104 audit(1707811814.785:1556): pid=11798 uid=0 auid=500 ses=32 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:14.785000 audit[11798]: CRED_DISP pid=11798 uid=0 auid=500 ses=32 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:14.785000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@32-145.40.90.207:22-139.178.68.195:51274 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:10:17.706034 env[1458]: time="2024-02-13T08:10:17.705978510Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 08:10:17.721037 env[1458]: time="2024-02-13T08:10:17.720997050Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:10:17.721223 kubelet[2569]: E0213 08:10:17.721210 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 08:10:17.721480 kubelet[2569]: E0213 08:10:17.721243 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 08:10:17.721480 kubelet[2569]: E0213 08:10:17.721285 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:10:17.721480 kubelet[2569]: E0213 08:10:17.721321 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 08:10:18.706411 env[1458]: time="2024-02-13T08:10:18.706311846Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 08:10:18.749181 env[1458]: time="2024-02-13T08:10:18.749080134Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:10:18.749431 kubelet[2569]: E0213 08:10:18.749383 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 08:10:18.749431 kubelet[2569]: E0213 08:10:18.749432 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 08:10:18.749864 kubelet[2569]: E0213 08:10:18.749482 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:10:18.749864 kubelet[2569]: E0213 08:10:18.749526 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 08:10:19.706917 env[1458]: time="2024-02-13T08:10:19.706780065Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 08:10:19.759547 env[1458]: time="2024-02-13T08:10:19.759451810Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:10:19.759798 kubelet[2569]: E0213 08:10:19.759710 2569 remote_runtime.go:205] "StopPodSandbox from runtime 
service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 08:10:19.759798 kubelet[2569]: E0213 08:10:19.759769 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 08:10:19.760294 kubelet[2569]: E0213 08:10:19.759850 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:10:19.760294 kubelet[2569]: E0213 08:10:19.759929 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 08:10:19.792574 systemd[1]: Started sshd@33-145.40.90.207:22-139.178.68.195:54782.service. Feb 13 08:10:19.791000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@33-145.40.90.207:22-139.178.68.195:54782 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:10:19.819640 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:10:19.819689 kernel: audit: type=1130 audit(1707811819.791:1558): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@33-145.40.90.207:22-139.178.68.195:54782 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:10:19.936000 audit[11915]: USER_ACCT pid=11915 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:19.937806 sshd[11915]: Accepted publickey for core from 139.178.68.195 port 54782 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:10:19.938907 sshd[11915]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:10:19.941279 systemd-logind[1446]: New session 33 of user core. Feb 13 08:10:19.941804 systemd[1]: Started session-33.scope. Feb 13 08:10:20.020463 sshd[11915]: pam_unix(sshd:session): session closed for user core Feb 13 08:10:20.021811 systemd[1]: sshd@33-145.40.90.207:22-139.178.68.195:54782.service: Deactivated successfully. 
Feb 13 08:10:20.022245 systemd[1]: session-33.scope: Deactivated successfully. Feb 13 08:10:20.022535 systemd-logind[1446]: Session 33 logged out. Waiting for processes to exit. Feb 13 08:10:20.022967 systemd-logind[1446]: Removed session 33. Feb 13 08:10:19.937000 audit[11915]: CRED_ACQ pid=11915 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:20.119608 kernel: audit: type=1101 audit(1707811819.936:1559): pid=11915 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:20.119674 kernel: audit: type=1103 audit(1707811819.937:1560): pid=11915 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:20.119694 kernel: audit: type=1006 audit(1707811819.937:1561): pid=11915 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=33 res=1 Feb 13 08:10:19.937000 audit[11915]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff65972f20 a2=3 a3=0 items=0 ppid=1 pid=11915 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=33 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:10:20.270144 kernel: audit: type=1300 audit(1707811819.937:1561): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff65972f20 a2=3 a3=0 items=0 ppid=1 pid=11915 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=33 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:10:20.270175 kernel: audit: type=1327 audit(1707811819.937:1561): proctitle=737368643A20636F7265205B707269765D Feb 13 08:10:19.937000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:10:20.300565 kernel: audit: type=1105 audit(1707811819.942:1562): pid=11915 uid=0 auid=500 ses=33 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:19.942000 audit[11915]: USER_START pid=11915 uid=0 auid=500 ses=33 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:20.394975 kernel: audit: type=1103 audit(1707811819.943:1563): pid=11917 uid=0 auid=500 ses=33 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:19.943000 audit[11917]: CRED_ACQ pid=11917 uid=0 auid=500 ses=33 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 
08:10:20.019000 audit[11915]: USER_END pid=11915 uid=0 auid=500 ses=33 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:20.579587 kernel: audit: type=1106 audit(1707811820.019:1564): pid=11915 uid=0 auid=500 ses=33 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:20.579621 kernel: audit: type=1104 audit(1707811820.019:1565): pid=11915 uid=0 auid=500 ses=33 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:20.019000 audit[11915]: CRED_DISP pid=11915 uid=0 auid=500 ses=33 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:20.020000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@33-145.40.90.207:22-139.178.68.195:54782 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:10:21.706779 env[1458]: time="2024-02-13T08:10:21.706625761Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 08:10:21.733786 env[1458]: time="2024-02-13T08:10:21.733723613Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:10:21.733950 kubelet[2569]: E0213 08:10:21.733921 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 08:10:21.733950 kubelet[2569]: E0213 08:10:21.733946 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 08:10:21.734140 kubelet[2569]: E0213 08:10:21.733967 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 
08:10:21.734140 kubelet[2569]: E0213 08:10:21.733986 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 08:10:25.030368 systemd[1]: Started sshd@34-145.40.90.207:22-139.178.68.195:54798.service. Feb 13 08:10:25.028000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@34-145.40.90.207:22-139.178.68.195:54798 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:10:25.057334 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:10:25.057387 kernel: audit: type=1130 audit(1707811825.028:1567): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@34-145.40.90.207:22-139.178.68.195:54798 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:10:25.173000 audit[11967]: USER_ACCT pid=11967 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:25.175451 sshd[11967]: Accepted publickey for core from 139.178.68.195 port 54798 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:10:25.176894 sshd[11967]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:10:25.179260 systemd-logind[1446]: New session 34 of user core. Feb 13 08:10:25.179809 systemd[1]: Started session-34.scope. Feb 13 08:10:25.259835 sshd[11967]: pam_unix(sshd:session): session closed for user core Feb 13 08:10:25.261259 systemd[1]: sshd@34-145.40.90.207:22-139.178.68.195:54798.service: Deactivated successfully. Feb 13 08:10:25.261698 systemd[1]: session-34.scope: Deactivated successfully. Feb 13 08:10:25.262115 systemd-logind[1446]: Session 34 logged out. Waiting for processes to exit. Feb 13 08:10:25.262611 systemd-logind[1446]: Removed session 34. 
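The StopPodSandbox failures repeating above all share one root cause: the Calico CNI delete hook cannot stat /var/lib/calico/nodename, the file the calico/node container is expected to provide via its /var/lib/calico/ mount (as the error text itself says). Until calico/node runs again and that file exists, every teardown of these sandboxes keeps failing identically. Below is a minimal Python sketch of that same precondition check, for confirming the state on the affected host; the path is taken verbatim from the error above, while the function name and messages are illustrative assumptions:

    import sys

    # Path taken verbatim from the CNI error above; per that error text,
    # calico/node is expected to mount /var/lib/calico/ and write the
    # registered node name into this file.
    NODENAME_FILE = "/var/lib/calico/nodename"

    def check_calico_nodename(path=NODENAME_FILE):
        # Reproduce the precondition the CNI delete hook keeps failing on.
        try:
            with open(path) as fh:
                nodename = fh.read().strip()
        except FileNotFoundError:
            # Exactly the state the kubelet keeps logging:
            # "stat /var/lib/calico/nodename: no such file or directory".
            print(f"{path} missing: calico/node is likely not running or "
                  "has not mounted /var/lib/calico/", file=sys.stderr)
            return 1
        print(f"calico/node registered this host as: {nodename}")
        return 0

    if __name__ == "__main__":
        sys.exit(check_calico_nodename())

A non-zero exit confirms the same missing-file condition the kubelet reports for csi-node-driver-8djc9, calico-kube-controllers-846b88998b-4vbpv, and the coredns pods in the entries above and below.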
Feb 13 08:10:25.175000 audit[11967]: CRED_ACQ pid=11967 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:25.357128 kernel: audit: type=1101 audit(1707811825.173:1568): pid=11967 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:25.357166 kernel: audit: type=1103 audit(1707811825.175:1569): pid=11967 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:25.357183 kernel: audit: type=1006 audit(1707811825.175:1570): pid=11967 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=34 res=1 Feb 13 08:10:25.415720 kernel: audit: type=1300 audit(1707811825.175:1570): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc527619a0 a2=3 a3=0 items=0 ppid=1 pid=11967 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=34 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:10:25.175000 audit[11967]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc527619a0 a2=3 a3=0 items=0 ppid=1 pid=11967 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=34 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:10:25.507678 kernel: audit: type=1327 audit(1707811825.175:1570): proctitle=737368643A20636F7265205B707269765D Feb 13 08:10:25.175000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:10:25.538115 kernel: audit: type=1105 audit(1707811825.180:1571): pid=11967 uid=0 auid=500 ses=34 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:25.180000 audit[11967]: USER_START pid=11967 uid=0 auid=500 ses=34 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:25.632565 kernel: audit: type=1103 audit(1707811825.180:1572): pid=11969 uid=0 auid=500 ses=34 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:25.180000 audit[11969]: CRED_ACQ pid=11969 uid=0 auid=500 ses=34 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:25.721759 kernel: audit: type=1106 audit(1707811825.258:1573): pid=11967 uid=0 auid=500 ses=34 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close 
grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:25.258000 audit[11967]: USER_END pid=11967 uid=0 auid=500 ses=34 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:25.783734 systemd[1]: Started sshd@35-145.40.90.207:22-101.43.185.249:50204.service. Feb 13 08:10:25.817278 kernel: audit: type=1104 audit(1707811825.258:1574): pid=11967 uid=0 auid=500 ses=34 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:25.258000 audit[11967]: CRED_DISP pid=11967 uid=0 auid=500 ses=34 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:25.259000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@34-145.40.90.207:22-139.178.68.195:54798 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:10:25.782000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@35-145.40.90.207:22-101.43.185.249:50204 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:10:28.706725 env[1458]: time="2024-02-13T08:10:28.706595435Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 08:10:28.757040 env[1458]: time="2024-02-13T08:10:28.756982566Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:10:28.757311 kubelet[2569]: E0213 08:10:28.757257 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 08:10:28.757311 kubelet[2569]: E0213 08:10:28.757301 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 08:10:28.757730 kubelet[2569]: E0213 08:10:28.757346 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox 
\\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:10:28.757730 kubelet[2569]: E0213 08:10:28.757381 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 08:10:29.539465 systemd[1]: Started sshd@36-145.40.90.207:22-43.153.220.201:42966.service. Feb 13 08:10:29.537000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@36-145.40.90.207:22-43.153.220.201:42966 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:10:30.270749 systemd[1]: Started sshd@37-145.40.90.207:22-139.178.68.195:47256.service. Feb 13 08:10:30.269000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@37-145.40.90.207:22-139.178.68.195:47256 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:10:30.298186 kernel: kauditd_printk_skb: 3 callbacks suppressed Feb 13 08:10:30.298244 kernel: audit: type=1130 audit(1707811830.269:1578): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@37-145.40.90.207:22-139.178.68.195:47256 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:10:30.413000 audit[12028]: USER_ACCT pid=12028 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:30.415808 sshd[12028]: Accepted publickey for core from 139.178.68.195 port 47256 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:10:30.418319 sshd[12028]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:10:30.424135 systemd-logind[1446]: New session 35 of user core. Feb 13 08:10:30.425478 systemd[1]: Started session-35.scope. 
Feb 13 08:10:30.416000 audit[12028]: CRED_ACQ pid=12028 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:30.594109 sshd[12025]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=43.153.220.201 user=root Feb 13 08:10:30.597450 kernel: audit: type=1101 audit(1707811830.413:1579): pid=12028 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:30.597487 kernel: audit: type=1103 audit(1707811830.416:1580): pid=12028 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:30.597505 kernel: audit: type=1006 audit(1707811830.416:1581): pid=12028 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=35 res=1 Feb 13 08:10:30.656036 kernel: audit: type=1300 audit(1707811830.416:1581): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffeea301520 a2=3 a3=0 items=0 ppid=1 pid=12028 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=35 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:10:30.416000 audit[12028]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffeea301520 a2=3 a3=0 items=0 ppid=1 pid=12028 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=35 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:10:30.748034 kernel: audit: type=1327 audit(1707811830.416:1581): proctitle=737368643A20636F7265205B707269765D Feb 13 08:10:30.416000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:10:30.748270 sshd[12028]: pam_unix(sshd:session): session closed for user core Feb 13 08:10:30.749721 systemd[1]: sshd@37-145.40.90.207:22-139.178.68.195:47256.service: Deactivated successfully. Feb 13 08:10:30.750143 systemd[1]: session-35.scope: Deactivated successfully. Feb 13 08:10:30.750487 systemd-logind[1446]: Session 35 logged out. Waiting for processes to exit. Feb 13 08:10:30.751022 systemd-logind[1446]: Removed session 35. 
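The audit records interleaved here become readable once two encodings are undone: PROCTITLE values are hex-encoded process titles, and audit(<epoch>.<millis>:<serial>) stamps carry a Unix timestamp that reproduces the journal's wall-clock prefix. A short standard-library sketch, with both constants copied verbatim from the records above:

    import binascii
    from datetime import datetime, timezone

    # Both values copied verbatim from the audit records above.
    PROCTITLE_HEX = "737368643A20636F7265205B707269765D"
    AUDIT_STAMP = "audit(1707811830.416:1581)"

    # PROCTITLE is hex-encoded because a process title may contain
    # spaces or NUL bytes; decoding recovers the readable form.
    print(binascii.unhexlify(PROCTITLE_HEX).decode())
    # -> sshd: core [priv]

    # The audit stamp embeds a Unix timestamp; converting it matches
    # the wall-clock prefix of the surrounding journal lines.
    epoch = float(AUDIT_STAMP[len("audit("):].split(":")[0])
    print(datetime.fromtimestamp(epoch, tz=timezone.utc))
    # -> 2024-02-13 08:10:30.416000+00:00, i.e. "Feb 13 08:10:30" above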
Feb 13 08:10:30.431000 audit[12028]: USER_START pid=12028 uid=0 auid=500 ses=35 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:30.872946 kernel: audit: type=1105 audit(1707811830.431:1582): pid=12028 uid=0 auid=500 ses=35 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:30.872981 kernel: audit: type=1103 audit(1707811830.432:1583): pid=12030 uid=0 auid=500 ses=35 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:30.432000 audit[12030]: CRED_ACQ pid=12030 uid=0 auid=500 ses=35 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:30.962141 kernel: audit: type=1100 audit(1707811830.592:1584): pid=12025 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? acct="root" exe="/usr/sbin/sshd" hostname=43.153.220.201 addr=43.153.220.201 terminal=ssh res=failed' Feb 13 08:10:30.592000 audit[12025]: USER_AUTH pid=12025 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? acct="root" exe="/usr/sbin/sshd" hostname=43.153.220.201 addr=43.153.220.201 terminal=ssh res=failed' Feb 13 08:10:31.050771 kernel: audit: type=1106 audit(1707811830.747:1585): pid=12028 uid=0 auid=500 ses=35 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:30.747000 audit[12028]: USER_END pid=12028 uid=0 auid=500 ses=35 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:30.747000 audit[12028]: CRED_DISP pid=12028 uid=0 auid=500 ses=35 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:30.748000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@37-145.40.90.207:22-139.178.68.195:47256 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:10:32.454917 sshd[12025]: Failed password for root from 43.153.220.201 port 42966 ssh2 Feb 13 08:10:32.706562 env[1458]: time="2024-02-13T08:10:32.706340679Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 08:10:32.733480 env[1458]: time="2024-02-13T08:10:32.733444321Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:10:32.733718 kubelet[2569]: E0213 08:10:32.733676 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 08:10:32.733718 kubelet[2569]: E0213 08:10:32.733704 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 08:10:32.733916 kubelet[2569]: E0213 08:10:32.733726 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:10:32.733916 kubelet[2569]: E0213 08:10:32.733744 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 08:10:34.223414 sshd[12025]: Received disconnect from 43.153.220.201 port 42966:11: Bye Bye [preauth] Feb 13 08:10:34.223414 sshd[12025]: Disconnected from authenticating user root 43.153.220.201 port 42966 [preauth] Feb 13 08:10:34.224116 systemd[1]: sshd@36-145.40.90.207:22-43.153.220.201:42966.service: Deactivated successfully. Feb 13 08:10:34.222000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@36-145.40.90.207:22-43.153.220.201:42966 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:10:34.706527 env[1458]: time="2024-02-13T08:10:34.706402136Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 08:10:34.761021 env[1458]: time="2024-02-13T08:10:34.760926279Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:10:34.761198 kubelet[2569]: E0213 08:10:34.761176 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 08:10:34.761506 kubelet[2569]: E0213 08:10:34.761220 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 08:10:34.761506 kubelet[2569]: E0213 08:10:34.761266 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:10:34.761506 kubelet[2569]: E0213 08:10:34.761300 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 08:10:35.702400 systemd[1]: Started sshd@38-145.40.90.207:22-139.178.68.195:47266.service. Feb 13 08:10:35.701000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@38-145.40.90.207:22-139.178.68.195:47266 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:10:35.730146 kernel: kauditd_printk_skb: 3 callbacks suppressed Feb 13 08:10:35.730203 kernel: audit: type=1130 audit(1707811835.701:1589): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@38-145.40.90.207:22-139.178.68.195:47266 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:10:35.846000 audit[12118]: USER_ACCT pid=12118 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:35.848710 sshd[12118]: Accepted publickey for core from 139.178.68.195 port 47266 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:10:35.849971 sshd[12118]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:10:35.852349 systemd-logind[1446]: New session 36 of user core. Feb 13 08:10:35.852880 systemd[1]: Started session-36.scope. Feb 13 08:10:35.933829 sshd[12118]: pam_unix(sshd:session): session closed for user core Feb 13 08:10:35.935371 systemd[1]: sshd@38-145.40.90.207:22-139.178.68.195:47266.service: Deactivated successfully. Feb 13 08:10:35.935844 systemd[1]: session-36.scope: Deactivated successfully. Feb 13 08:10:35.936299 systemd-logind[1446]: Session 36 logged out. Waiting for processes to exit. Feb 13 08:10:35.936955 systemd-logind[1446]: Removed session 36. Feb 13 08:10:35.848000 audit[12118]: CRED_ACQ pid=12118 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:36.033958 kernel: audit: type=1101 audit(1707811835.846:1590): pid=12118 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:36.033995 kernel: audit: type=1103 audit(1707811835.848:1591): pid=12118 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:36.034012 kernel: audit: type=1006 audit(1707811835.848:1592): pid=12118 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=36 res=1 Feb 13 08:10:36.092839 kernel: audit: type=1300 audit(1707811835.848:1592): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe3939b5f0 a2=3 a3=0 items=0 ppid=1 pid=12118 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=36 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:10:35.848000 audit[12118]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe3939b5f0 a2=3 a3=0 items=0 ppid=1 pid=12118 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=36 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:10:36.185319 kernel: audit: type=1327 audit(1707811835.848:1592): proctitle=737368643A20636F7265205B707269765D Feb 13 08:10:35.848000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:10:36.215909 kernel: audit: type=1105 audit(1707811835.853:1593): pid=12118 uid=0 auid=500 ses=36 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 
08:10:35.853000 audit[12118]: USER_START pid=12118 uid=0 auid=500 ses=36 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:36.310850 kernel: audit: type=1103 audit(1707811835.854:1594): pid=12120 uid=0 auid=500 ses=36 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:35.854000 audit[12120]: CRED_ACQ pid=12120 uid=0 auid=500 ses=36 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:36.400529 kernel: audit: type=1106 audit(1707811835.932:1595): pid=12118 uid=0 auid=500 ses=36 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:35.932000 audit[12118]: USER_END pid=12118 uid=0 auid=500 ses=36 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:36.496593 kernel: audit: type=1104 audit(1707811835.932:1596): pid=12118 uid=0 auid=500 ses=36 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:35.932000 audit[12118]: CRED_DISP pid=12118 uid=0 auid=500 ses=36 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:35.933000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@38-145.40.90.207:22-139.178.68.195:47266 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:10:36.705978 env[1458]: time="2024-02-13T08:10:36.705847147Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 08:10:36.731029 env[1458]: time="2024-02-13T08:10:36.730995677Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:10:36.731230 kubelet[2569]: E0213 08:10:36.731188 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 08:10:36.731230 kubelet[2569]: E0213 08:10:36.731213 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 08:10:36.731426 kubelet[2569]: E0213 08:10:36.731234 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:10:36.731426 kubelet[2569]: E0213 08:10:36.731255 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 08:10:40.706239 env[1458]: time="2024-02-13T08:10:40.706144650Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 08:10:40.759851 env[1458]: time="2024-02-13T08:10:40.759775845Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:10:40.760101 kubelet[2569]: E0213 08:10:40.760076 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox 
\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 08:10:40.760489 kubelet[2569]: E0213 08:10:40.760126 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 08:10:40.760489 kubelet[2569]: E0213 08:10:40.760181 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:10:40.760489 kubelet[2569]: E0213 08:10:40.760223 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 08:10:40.944116 systemd[1]: Started sshd@39-145.40.90.207:22-139.178.68.195:45752.service. Feb 13 08:10:40.942000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@39-145.40.90.207:22-139.178.68.195:45752 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:10:40.971187 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:10:40.971314 kernel: audit: type=1130 audit(1707811840.942:1598): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@39-145.40.90.207:22-139.178.68.195:45752 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:10:41.087000 audit[12202]: USER_ACCT pid=12202 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:41.089690 sshd[12202]: Accepted publickey for core from 139.178.68.195 port 45752 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:10:41.090800 sshd[12202]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:10:41.093108 systemd-logind[1446]: New session 37 of user core. Feb 13 08:10:41.093603 systemd[1]: Started session-37.scope. Feb 13 08:10:41.172986 sshd[12202]: pam_unix(sshd:session): session closed for user core Feb 13 08:10:41.174348 systemd[1]: sshd@39-145.40.90.207:22-139.178.68.195:45752.service: Deactivated successfully. Feb 13 08:10:41.174784 systemd[1]: session-37.scope: Deactivated successfully. Feb 13 08:10:41.175209 systemd-logind[1446]: Session 37 logged out. 
Waiting for processes to exit. Feb 13 08:10:41.175598 systemd-logind[1446]: Removed session 37. Feb 13 08:10:41.088000 audit[12202]: CRED_ACQ pid=12202 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:41.273591 kernel: audit: type=1101 audit(1707811841.087:1599): pid=12202 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:41.273641 kernel: audit: type=1103 audit(1707811841.088:1600): pid=12202 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:41.273659 kernel: audit: type=1006 audit(1707811841.088:1601): pid=12202 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=37 res=1 Feb 13 08:10:41.332490 kernel: audit: type=1300 audit(1707811841.088:1601): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffacbb9c10 a2=3 a3=0 items=0 ppid=1 pid=12202 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=37 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:10:41.088000 audit[12202]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffacbb9c10 a2=3 a3=0 items=0 ppid=1 pid=12202 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=37 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:10:41.424890 kernel: audit: type=1327 audit(1707811841.088:1601): proctitle=737368643A20636F7265205B707269765D Feb 13 08:10:41.088000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:10:41.455512 kernel: audit: type=1105 audit(1707811841.094:1602): pid=12202 uid=0 auid=500 ses=37 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:41.094000 audit[12202]: USER_START pid=12202 uid=0 auid=500 ses=37 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:41.550580 kernel: audit: type=1103 audit(1707811841.094:1603): pid=12204 uid=0 auid=500 ses=37 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:41.094000 audit[12204]: CRED_ACQ pid=12204 uid=0 auid=500 ses=37 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:41.639818 kernel: audit: type=1106 audit(1707811841.171:1604): pid=12202 uid=0 auid=500 ses=37 subj=system_u:system_r:kernel_t:s0 
msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:41.171000 audit[12202]: USER_END pid=12202 uid=0 auid=500 ses=37 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:41.735260 kernel: audit: type=1104 audit(1707811841.172:1605): pid=12202 uid=0 auid=500 ses=37 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:41.172000 audit[12202]: CRED_DISP pid=12202 uid=0 auid=500 ses=37 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:41.172000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@39-145.40.90.207:22-139.178.68.195:45752 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:10:46.120000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:10:46.154932 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:10:46.155006 kernel: audit: type=1400 audit(1707811846.120:1607): avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:10:46.175675 systemd[1]: Started sshd@40-145.40.90.207:22-139.178.68.195:33346.service. 
Feb 13 08:10:46.120000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:10:46.333454 kernel: audit: type=1400 audit(1707811846.120:1608): avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:10:46.333496 kernel: audit: type=1300 audit(1707811846.120:1607): arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c00278d4a0 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:10:46.120000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c00278d4a0 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:10:46.361011 sshd[12228]: Accepted publickey for core from 139.178.68.195 port 33346 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:10:46.362288 sshd[12228]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:10:46.364573 systemd-logind[1446]: New session 38 of user core. Feb 13 08:10:46.365084 systemd[1]: Started session-38.scope. Feb 13 08:10:46.443653 sshd[12228]: pam_unix(sshd:session): session closed for user core Feb 13 08:10:46.445113 systemd[1]: sshd@40-145.40.90.207:22-139.178.68.195:33346.service: Deactivated successfully. Feb 13 08:10:46.445607 systemd[1]: session-38.scope: Deactivated successfully. Feb 13 08:10:46.445946 systemd-logind[1446]: Session 38 logged out. Waiting for processes to exit. Feb 13 08:10:46.446317 systemd-logind[1446]: Removed session 38. 
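The much longer PROCTITLE values attached to the AVC denials in this stretch (comm="kube-controller" and comm="kube-apiserver") use the same hex encoding but pack the whole command line with NUL separators, and the audit subsystem caps their length, so the final flag arrives cut off; that truncation is in the log itself and stays that way. Splitting on NUL recovers the argv, as in this sketch with the hex copied verbatim from the kube-controller record just below:

    # Hex copied verbatim from the kube-controller PROCTITLE record
    # nearby; NUL (0x00) separates the argv entries.
    RAW = ("6B7562652D636F6E74726F6C6C65722D6D616E61676572"
           "002D2D616C6C6F636174652D6E6F64652D63696472733D74727565"
           "002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D"
           "2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E61"
           "6765722E636F6E66002D2D617574686F7269")

    for arg in bytes.fromhex(RAW).decode().split("\x00"):
        print(arg)
    # -> kube-controller-manager
    #    --allocate-node-cidrs=true
    #    --authentication-kubeconfig=/etc/kubernetes/controller-manager.conf
    #    --authori          (cut off by the audit length cap)

The denials themselves are inotify watch refusals: on x86_64 (arch=c000003e) syscall=254 is inotify_add_watch, and exit=-13 is EACCES, so the control-plane containers (svirt_lxc_net_t) are being refused watches on the host PKI files (etc_t) with permissive=0; they recur harmlessly alongside the SSH and Calico events around them.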
Feb 13 08:10:46.452777 kernel: audit: type=1327 audit(1707811846.120:1607): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:10:46.120000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:10:46.545123 kernel: audit: type=1300 audit(1707811846.120:1608): arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c0005d05a0 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:10:46.120000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c0005d05a0 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:10:46.120000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:10:46.706089 env[1458]: time="2024-02-13T08:10:46.706005730Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 08:10:46.718357 env[1458]: time="2024-02-13T08:10:46.718276413Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:10:46.718518 kubelet[2569]: E0213 08:10:46.718506 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 08:10:46.718689 kubelet[2569]: E0213 08:10:46.718534 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 08:10:46.718689 kubelet[2569]: E0213 08:10:46.718557 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin 
type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:10:46.718689 kubelet[2569]: E0213 08:10:46.718574 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 08:10:46.756941 kernel: audit: type=1327 audit(1707811846.120:1608): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:10:46.756985 kernel: audit: type=1130 audit(1707811846.173:1609): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@40-145.40.90.207:22-139.178.68.195:33346 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:10:46.173000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@40-145.40.90.207:22-139.178.68.195:33346 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:10:46.846043 kernel: audit: type=1400 audit(1707811846.186:1610): avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/apiserver.crt" dev="sdb9" ino=524817 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:10:46.186000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/apiserver.crt" dev="sdb9" ino=524817 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:10:46.936492 kernel: audit: type=1300 audit(1707811846.186:1610): arch=c000003e syscall=254 success=no exit=-13 a0=5c a1=c00df7c2d0 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:10:46.186000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5c a1=c00df7c2d0 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:10:46.186000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:10:47.127380 kernel: audit: type=1327 audit(1707811846.186:1610): 
proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:10:46.186000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:10:46.186000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5c a1=c00df7c300 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:10:46.186000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:10:46.186000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:10:46.186000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5c a1=c0028a4560 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:10:46.186000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:10:46.357000 audit[12228]: USER_ACCT pid=12228 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:46.360000 audit[12228]: CRED_ACQ pid=12228 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:46.360000 audit[12228]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff34c612d0 a2=3 a3=0 items=0 ppid=1 pid=12228 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=38 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:10:46.360000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:10:46.362000 audit[12228]: USER_START pid=12228 uid=0 auid=500 ses=38 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:46.362000 audit[12230]: CRED_ACQ 
pid=12230 uid=0 auid=500 ses=38 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:46.442000 audit[12228]: USER_END pid=12228 uid=0 auid=500 ses=38 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:46.442000 audit[12228]: CRED_DISP pid=12228 uid=0 auid=500 ses=38 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:46.442000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@40-145.40.90.207:22-139.178.68.195:33346 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:10:46.930000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-client.crt" dev="sdb9" ino=524823 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:10:46.930000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:10:46.930000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5c a1=c00b5a6e10 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:10:46.930000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5d a1=c009a6cfc0 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:10:46.930000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:10:46.930000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:10:46.930000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:10:46.930000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5c a1=c00e5e2ac0 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:10:46.930000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:10:47.706018 env[1458]: time="2024-02-13T08:10:47.705922912Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 08:10:47.757287 env[1458]: time="2024-02-13T08:10:47.757193399Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:10:47.757744 kubelet[2569]: E0213 08:10:47.757451 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 08:10:47.757744 kubelet[2569]: E0213 08:10:47.757500 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 08:10:47.757744 kubelet[2569]: E0213 08:10:47.757552 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:10:47.757744 kubelet[2569]: E0213 08:10:47.757595 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 08:10:50.707106 env[1458]: time="2024-02-13T08:10:50.706980034Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 08:10:50.733795 env[1458]: time="2024-02-13T08:10:50.733759708Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin 
type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:10:50.733972 kubelet[2569]: E0213 08:10:50.733931 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 08:10:50.733972 kubelet[2569]: E0213 08:10:50.733954 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 08:10:50.734146 kubelet[2569]: E0213 08:10:50.733975 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:10:50.734146 kubelet[2569]: E0213 08:10:50.733992 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 08:10:50.870000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:10:50.870000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c000bc2380 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:10:50.870000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:10:50.876000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:10:50.876000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c002924b20 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 
fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:10:50.876000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:10:50.876000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:10:50.876000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c002b27520 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:10:50.876000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:10:50.879000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:10:50.879000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c000d7e7e0 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:10:50.879000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:10:51.453365 systemd[1]: Started sshd@41-145.40.90.207:22-139.178.68.195:33360.service. Feb 13 08:10:51.451000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@41-145.40.90.207:22-139.178.68.195:33360 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:10:51.480364 kernel: kauditd_printk_skb: 37 callbacks suppressed Feb 13 08:10:51.480421 kernel: audit: type=1130 audit(1707811851.451:1628): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@41-145.40.90.207:22-139.178.68.195:33360 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:10:51.596000 audit[12347]: USER_ACCT pid=12347 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:51.598665 sshd[12347]: Accepted publickey for core from 139.178.68.195 port 33360 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:10:51.599900 sshd[12347]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:10:51.602137 systemd-logind[1446]: New session 39 of user core. Feb 13 08:10:51.602600 systemd[1]: Started session-39.scope. Feb 13 08:10:51.683334 sshd[12347]: pam_unix(sshd:session): session closed for user core Feb 13 08:10:51.684782 systemd[1]: sshd@41-145.40.90.207:22-139.178.68.195:33360.service: Deactivated successfully. Feb 13 08:10:51.685229 systemd[1]: session-39.scope: Deactivated successfully. Feb 13 08:10:51.685567 systemd-logind[1446]: Session 39 logged out. Waiting for processes to exit. Feb 13 08:10:51.686181 systemd-logind[1446]: Removed session 39. Feb 13 08:10:51.598000 audit[12347]: CRED_ACQ pid=12347 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:51.782579 kernel: audit: type=1101 audit(1707811851.596:1629): pid=12347 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:51.782616 kernel: audit: type=1103 audit(1707811851.598:1630): pid=12347 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:51.782636 kernel: audit: type=1006 audit(1707811851.598:1631): pid=12347 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=39 res=1 Feb 13 08:10:51.841381 kernel: audit: type=1300 audit(1707811851.598:1631): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe61318e60 a2=3 a3=0 items=0 ppid=1 pid=12347 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=39 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:10:51.598000 audit[12347]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe61318e60 a2=3 a3=0 items=0 ppid=1 pid=12347 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=39 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:10:51.933810 kernel: audit: type=1327 audit(1707811851.598:1631): proctitle=737368643A20636F7265205B707269765D Feb 13 08:10:51.598000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:10:51.964714 kernel: audit: type=1105 audit(1707811851.603:1632): pid=12347 uid=0 auid=500 ses=39 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 
08:10:51.603000 audit[12347]: USER_START pid=12347 uid=0 auid=500 ses=39 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:52.060456 kernel: audit: type=1103 audit(1707811851.604:1633): pid=12349 uid=0 auid=500 ses=39 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:51.604000 audit[12349]: CRED_ACQ pid=12349 uid=0 auid=500 ses=39 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:52.150125 kernel: audit: type=1106 audit(1707811851.682:1634): pid=12347 uid=0 auid=500 ses=39 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:51.682000 audit[12347]: USER_END pid=12347 uid=0 auid=500 ses=39 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:52.245938 kernel: audit: type=1104 audit(1707811851.682:1635): pid=12347 uid=0 auid=500 ses=39 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:51.682000 audit[12347]: CRED_DISP pid=12347 uid=0 auid=500 ses=39 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:51.683000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@41-145.40.90.207:22-139.178.68.195:33360 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:10:52.706847 env[1458]: time="2024-02-13T08:10:52.706751223Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 08:10:52.761540 env[1458]: time="2024-02-13T08:10:52.761469040Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:10:52.761825 kubelet[2569]: E0213 08:10:52.761761 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 08:10:52.761825 kubelet[2569]: E0213 08:10:52.761812 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 08:10:52.762320 kubelet[2569]: E0213 08:10:52.761866 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:10:52.762320 kubelet[2569]: E0213 08:10:52.761910 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 08:10:56.692365 systemd[1]: Started sshd@42-145.40.90.207:22-139.178.68.195:50640.service. Feb 13 08:10:56.690000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@42-145.40.90.207:22-139.178.68.195:50640 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:10:56.719684 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:10:56.719747 kernel: audit: type=1130 audit(1707811856.690:1637): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@42-145.40.90.207:22-139.178.68.195:50640 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:10:56.836000 audit[12403]: USER_ACCT pid=12403 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:56.838644 sshd[12403]: Accepted publickey for core from 139.178.68.195 port 50640 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:10:56.839928 sshd[12403]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:10:56.842314 systemd-logind[1446]: New session 40 of user core. Feb 13 08:10:56.842842 systemd[1]: Started session-40.scope. Feb 13 08:10:56.922515 sshd[12403]: pam_unix(sshd:session): session closed for user core Feb 13 08:10:56.924078 systemd[1]: sshd@42-145.40.90.207:22-139.178.68.195:50640.service: Deactivated successfully. Feb 13 08:10:56.924509 systemd[1]: session-40.scope: Deactivated successfully. Feb 13 08:10:56.924894 systemd-logind[1446]: Session 40 logged out. Waiting for processes to exit. Feb 13 08:10:56.925452 systemd-logind[1446]: Removed session 40. Feb 13 08:10:56.838000 audit[12403]: CRED_ACQ pid=12403 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:57.021020 kernel: audit: type=1101 audit(1707811856.836:1638): pid=12403 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:57.021061 kernel: audit: type=1103 audit(1707811856.838:1639): pid=12403 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:57.021079 kernel: audit: type=1006 audit(1707811856.838:1640): pid=12403 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=40 res=1 Feb 13 08:10:57.079710 kernel: audit: type=1300 audit(1707811856.838:1640): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffef720de50 a2=3 a3=0 items=0 ppid=1 pid=12403 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=40 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:10:56.838000 audit[12403]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffef720de50 a2=3 a3=0 items=0 ppid=1 pid=12403 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=40 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:10:57.171709 kernel: audit: type=1327 audit(1707811856.838:1640): proctitle=737368643A20636F7265205B707269765D Feb 13 08:10:56.838000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:10:57.202173 kernel: audit: type=1105 audit(1707811856.843:1641): pid=12403 uid=0 auid=500 ses=40 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 
08:10:56.843000 audit[12403]: USER_START pid=12403 uid=0 auid=500 ses=40 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:57.296800 kernel: audit: type=1103 audit(1707811856.843:1642): pid=12405 uid=0 auid=500 ses=40 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:56.843000 audit[12405]: CRED_ACQ pid=12405 uid=0 auid=500 ses=40 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:57.386038 kernel: audit: type=1106 audit(1707811856.921:1643): pid=12403 uid=0 auid=500 ses=40 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:56.921000 audit[12403]: USER_END pid=12403 uid=0 auid=500 ses=40 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:57.481572 kernel: audit: type=1104 audit(1707811856.921:1644): pid=12403 uid=0 auid=500 ses=40 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:56.921000 audit[12403]: CRED_DISP pid=12403 uid=0 auid=500 ses=40 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:10:56.922000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@42-145.40.90.207:22-139.178.68.195:50640 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:11:01.706480 env[1458]: time="2024-02-13T08:11:01.706429404Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 08:11:01.723028 env[1458]: time="2024-02-13T08:11:01.722967455Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:11:01.723172 kubelet[2569]: E0213 08:11:01.723160 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 08:11:01.723344 kubelet[2569]: E0213 08:11:01.723187 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 08:11:01.723344 kubelet[2569]: E0213 08:11:01.723210 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:11:01.723344 kubelet[2569]: E0213 08:11:01.723227 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 08:11:01.931642 systemd[1]: Started sshd@43-145.40.90.207:22-139.178.68.195:50652.service. Feb 13 08:11:01.930000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@43-145.40.90.207:22-139.178.68.195:50652 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:11:01.958313 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:11:01.958409 kernel: audit: type=1130 audit(1707811861.930:1646): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@43-145.40.90.207:22-139.178.68.195:50652 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:11:02.072000 audit[12456]: USER_ACCT pid=12456 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:11:02.074737 sshd[12456]: Accepted publickey for core from 139.178.68.195 port 50652 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:11:02.075920 sshd[12456]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:11:02.078236 systemd-logind[1446]: New session 41 of user core. Feb 13 08:11:02.078740 systemd[1]: Started session-41.scope. Feb 13 08:11:02.159708 sshd[12456]: pam_unix(sshd:session): session closed for user core Feb 13 08:11:02.161224 systemd[1]: sshd@43-145.40.90.207:22-139.178.68.195:50652.service: Deactivated successfully. Feb 13 08:11:02.161662 systemd[1]: session-41.scope: Deactivated successfully. Feb 13 08:11:02.162050 systemd-logind[1446]: Session 41 logged out. Waiting for processes to exit. Feb 13 08:11:02.162527 systemd-logind[1446]: Removed session 41. Feb 13 08:11:02.074000 audit[12456]: CRED_ACQ pid=12456 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:11:02.168719 kernel: audit: type=1101 audit(1707811862.072:1647): pid=12456 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:11:02.168757 kernel: audit: type=1103 audit(1707811862.074:1648): pid=12456 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:11:02.317310 kernel: audit: type=1006 audit(1707811862.074:1649): pid=12456 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=41 res=1 Feb 13 08:11:02.317347 kernel: audit: type=1300 audit(1707811862.074:1649): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc3d4e7a50 a2=3 a3=0 items=0 ppid=1 pid=12456 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=41 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:11:02.074000 audit[12456]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc3d4e7a50 a2=3 a3=0 items=0 ppid=1 pid=12456 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=41 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:11:02.409329 kernel: audit: type=1327 audit(1707811862.074:1649): proctitle=737368643A20636F7265205B707269765D Feb 13 08:11:02.074000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:11:02.439771 kernel: audit: type=1105 audit(1707811862.079:1650): pid=12456 uid=0 auid=500 ses=41 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 
08:11:02.079000 audit[12456]: USER_START pid=12456 uid=0 auid=500 ses=41 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:11:02.534408 kernel: audit: type=1103 audit(1707811862.080:1651): pid=12458 uid=0 auid=500 ses=41 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:11:02.080000 audit[12458]: CRED_ACQ pid=12458 uid=0 auid=500 ses=41 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:11:02.623716 kernel: audit: type=1106 audit(1707811862.158:1652): pid=12456 uid=0 auid=500 ses=41 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:11:02.158000 audit[12456]: USER_END pid=12456 uid=0 auid=500 ses=41 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:11:02.705403 env[1458]: time="2024-02-13T08:11:02.705366889Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 08:11:02.705403 env[1458]: time="2024-02-13T08:11:02.705378802Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 08:11:02.717553 env[1458]: time="2024-02-13T08:11:02.717510725Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:11:02.717855 env[1458]: time="2024-02-13T08:11:02.717790434Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:11:02.717891 kubelet[2569]: E0213 08:11:02.717724 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 08:11:02.717891 kubelet[2569]: E0213 08:11:02.717757 2569 
kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 08:11:02.717891 kubelet[2569]: E0213 08:11:02.717792 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:11:02.717891 kubelet[2569]: E0213 08:11:02.717819 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 08:11:02.718033 kubelet[2569]: E0213 08:11:02.717895 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 08:11:02.718033 kubelet[2569]: E0213 08:11:02.717910 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 08:11:02.718033 kubelet[2569]: E0213 08:11:02.717927 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:11:02.718033 kubelet[2569]: E0213 08:11:02.717942 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 08:11:02.158000 audit[12456]: CRED_DISP pid=12456 uid=0 auid=500 ses=41 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" 
hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:11:02.809411 kernel: audit: type=1104 audit(1707811862.158:1653): pid=12456 uid=0 auid=500 ses=41 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:11:02.159000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@43-145.40.90.207:22-139.178.68.195:50652 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:11:03.707026 env[1458]: time="2024-02-13T08:11:03.706934902Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 08:11:03.725074 env[1458]: time="2024-02-13T08:11:03.725009817Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:11:03.725302 kubelet[2569]: E0213 08:11:03.725164 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 08:11:03.725302 kubelet[2569]: E0213 08:11:03.725190 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 08:11:03.725302 kubelet[2569]: E0213 08:11:03.725214 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:11:03.725302 kubelet[2569]: E0213 08:11:03.725232 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 08:11:07.169164 systemd[1]: Started sshd@44-145.40.90.207:22-139.178.68.195:50614.service. 
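[editor's note] Every StopPodSandbox failure in this stretch (sandboxes 2eb78bff…, ce02c9de…, 8d49648f…, 1a7bd63d…) reports the same root cause: the Calico CNI plugin stats /var/lib/calico/nodename, a file the calico/node container writes when it starts, and the file is missing, so the kubelet keeps requeueing both coredns replicas, calico-kube-controllers-846b88998b-4vbpv, and csi-node-driver-8djc9. A minimal sketch of the precondition the plugin is enforcing, with the path taken from the error text (the wrapper itself is illustrative):

```python
import sys

NODENAME_FILE = "/var/lib/calico/nodename"  # written by calico/node at startup

def calico_node_ready() -> bool:
    """Mirror the plugin's precondition: CNI ADD/DEL cannot proceed until
    calico/node has recorded this host's node name."""
    try:
        with open(NODENAME_FILE) as f:
            return bool(f.read().strip())
    except FileNotFoundError:
        # The exact condition the log reports:
        # "stat /var/lib/calico/nodename: no such file or directory"
        return False

if __name__ == "__main__":
    if not calico_node_ready():
        sys.exit("calico/node has not populated %s; check that the "
                 "calico-node pod is running on this host" % NODENAME_FILE)
```

Because the kubelet treats KillPodSandbox as retryable, near-identical entries recur at 08:10:46, 08:10:47, 08:10:50, 08:10:52, 08:11:01, 08:11:02, and 08:11:03, and will keep recurring until calico-node is back on this host.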
Feb 13 08:11:07.167000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@44-145.40.90.207:22-139.178.68.195:50614 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:11:07.195657 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:11:07.195758 kernel: audit: type=1130 audit(1707811867.167:1655): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@44-145.40.90.207:22-139.178.68.195:50614 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:11:07.311000 audit[12568]: USER_ACCT pid=12568 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:11:07.313100 sshd[12568]: Accepted publickey for core from 139.178.68.195 port 50614 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:11:07.313920 sshd[12568]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:11:07.316581 systemd-logind[1446]: New session 42 of user core. Feb 13 08:11:07.317613 systemd[1]: Started session-42.scope. Feb 13 08:11:07.399276 sshd[12568]: pam_unix(sshd:session): session closed for user core Feb 13 08:11:07.401053 systemd[1]: sshd@44-145.40.90.207:22-139.178.68.195:50614.service: Deactivated successfully. Feb 13 08:11:07.401889 systemd[1]: session-42.scope: Deactivated successfully. Feb 13 08:11:07.402501 systemd-logind[1446]: Session 42 logged out. Waiting for processes to exit. Feb 13 08:11:07.403161 systemd-logind[1446]: Removed session 42. 
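[editor's note] Sessions 38 through 43 from 139.178.68.195 all follow the same shape: publickey accept, pam_unix session open, session-N.scope start, then a close within roughly a second, which is consistent with an automated probe or orchestration check rather than interactive use. A small sketch that pairs the systemd-logind open/close lines and reports each session's duration (regexes written against the journald format shown in this log; illustrative only):

```python
import re
from datetime import datetime

OPEN = re.compile(r"New session (\d+) of user (\w+)")
CLOSE = re.compile(r"Removed session (\d+)\.")
STAMP = re.compile(r"^(\w{3} \d{2} \d{2}:\d{2}:\d{2}\.\d+)")

def session_durations(lines, year=2024):
    """Pair logind open/close messages and yield (session, seconds)."""
    opened = {}
    for line in lines:
        ts_match = STAMP.match(line)
        if not ts_match:
            continue
        ts = datetime.strptime(f"{year} {ts_match.group(1)}",
                               "%Y %b %d %H:%M:%S.%f")
        if m := OPEN.search(line):
            opened[m.group(1)] = ts
        elif (m := CLOSE.search(line)) and m.group(1) in opened:
            yield m.group(1), (ts - opened.pop(m.group(1))).total_seconds()

log = [
    "Feb 13 08:10:51.602137 systemd-logind[1446]: New session 39 of user core.",
    "Feb 13 08:10:51.686181 systemd-logind[1446]: Removed session 39.",
]
for sid, secs in session_durations(log):
    print(f"session {sid}: {secs:.3f}s")  # session 39: 0.084s
```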
Feb 13 08:11:07.312000 audit[12568]: CRED_ACQ pid=12568 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:11:07.497800 kernel: audit: type=1101 audit(1707811867.311:1656): pid=12568 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:11:07.497843 kernel: audit: type=1103 audit(1707811867.312:1657): pid=12568 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:11:07.497863 kernel: audit: type=1006 audit(1707811867.312:1658): pid=12568 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=42 res=1
Feb 13 08:11:07.556433 kernel: audit: type=1300 audit(1707811867.312:1658): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe10ad97a0 a2=3 a3=0 items=0 ppid=1 pid=12568 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=42 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:11:07.312000 audit[12568]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe10ad97a0 a2=3 a3=0 items=0 ppid=1 pid=12568 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=42 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:11:07.648443 kernel: audit: type=1327 audit(1707811867.312:1658): proctitle=737368643A20636F7265205B707269765D
Feb 13 08:11:07.312000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D
Feb 13 08:11:07.678916 kernel: audit: type=1105 audit(1707811867.318:1659): pid=12568 uid=0 auid=500 ses=42 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:11:07.318000 audit[12568]: USER_START pid=12568 uid=0 auid=500 ses=42 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:11:07.773379 kernel: audit: type=1103 audit(1707811867.318:1660): pid=12570 uid=0 auid=500 ses=42 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:11:07.318000 audit[12570]: CRED_ACQ pid=12570 uid=0 auid=500 ses=42 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:11:07.862581 kernel: audit: type=1106 audit(1707811867.398:1661): pid=12568 uid=0 auid=500 ses=42 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:11:07.398000 audit[12568]: USER_END pid=12568 uid=0 auid=500 ses=42 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:11:07.958070 kernel: audit: type=1104 audit(1707811867.398:1662): pid=12568 uid=0 auid=500 ses=42 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:11:07.398000 audit[12568]: CRED_DISP pid=12568 uid=0 auid=500 ses=42 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:11:07.399000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@44-145.40.90.207:22-139.178.68.195:50614 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:11:12.409121 systemd[1]: Started sshd@45-145.40.90.207:22-139.178.68.195:50622.service.
Feb 13 08:11:12.407000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@45-145.40.90.207:22-139.178.68.195:50622 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:11:12.436111 kernel: kauditd_printk_skb: 1 callbacks suppressed
Feb 13 08:11:12.436220 kernel: audit: type=1130 audit(1707811872.407:1664): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@45-145.40.90.207:22-139.178.68.195:50622 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:11:12.553000 audit[12593]: USER_ACCT pid=12593 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:11:12.555157 sshd[12593]: Accepted publickey for core from 139.178.68.195 port 50622 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg
Feb 13 08:11:12.557379 sshd[12593]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Feb 13 08:11:12.562442 systemd-logind[1446]: New session 43 of user core.
Feb 13 08:11:12.563527 systemd[1]: Started session-43.scope.
Feb 13 08:11:12.641704 sshd[12593]: pam_unix(sshd:session): session closed for user core
Feb 13 08:11:12.643226 systemd[1]: sshd@45-145.40.90.207:22-139.178.68.195:50622.service: Deactivated successfully.
Feb 13 08:11:12.643680 systemd[1]: session-43.scope: Deactivated successfully.
Feb 13 08:11:12.644084 systemd-logind[1446]: Session 43 logged out. Waiting for processes to exit.
Feb 13 08:11:12.644512 systemd-logind[1446]: Removed session 43.
Feb 13 08:11:12.555000 audit[12593]: CRED_ACQ pid=12593 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:11:12.736974 kernel: audit: type=1101 audit(1707811872.553:1665): pid=12593 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:11:12.737019 kernel: audit: type=1103 audit(1707811872.555:1666): pid=12593 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:11:12.737036 kernel: audit: type=1006 audit(1707811872.555:1667): pid=12593 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=43 res=1
Feb 13 08:11:12.795598 kernel: audit: type=1300 audit(1707811872.555:1667): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff69a9bb50 a2=3 a3=0 items=0 ppid=1 pid=12593 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=43 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:11:12.555000 audit[12593]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff69a9bb50 a2=3 a3=0 items=0 ppid=1 pid=12593 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=43 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:11:12.887588 kernel: audit: type=1327 audit(1707811872.555:1667): proctitle=737368643A20636F7265205B707269765D
Feb 13 08:11:12.555000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D
Feb 13 08:11:12.564000 audit[12593]: USER_START pid=12593 uid=0 auid=500 ses=43 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:11:13.012511 kernel: audit: type=1105 audit(1707811872.564:1668): pid=12593 uid=0 auid=500 ses=43 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:11:13.012547 kernel: audit: type=1103 audit(1707811872.564:1669): pid=12595 uid=0 auid=500 ses=43 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:11:12.564000 audit[12595]: CRED_ACQ pid=12595 uid=0 auid=500 ses=43 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:11:13.101711 kernel: audit: type=1106 audit(1707811872.640:1670): pid=12593 uid=0 auid=500 ses=43 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:11:12.640000 audit[12593]: USER_END pid=12593 uid=0 auid=500 ses=43 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:11:12.640000 audit[12593]: CRED_DISP pid=12593 uid=0 auid=500 ses=43 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:11:13.286479 kernel: audit: type=1104 audit(1707811872.640:1671): pid=12593 uid=0 auid=500 ses=43 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:11:12.641000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@45-145.40.90.207:22-139.178.68.195:50622 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:11:14.711986 env[1458]: time="2024-02-13T08:11:14.711885238Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\""
Feb 13 08:11:14.712959 env[1458]: time="2024-02-13T08:11:14.711885246Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\""
Feb 13 08:11:14.772939 env[1458]: time="2024-02-13T08:11:14.772808702Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 08:11:14.773258 kubelet[2569]: E0213 08:11:14.773202 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988"
Feb 13 08:11:14.773824 kubelet[2569]: E0213 08:11:14.773265 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988}
Feb 13 08:11:14.773824 kubelet[2569]: E0213 08:11:14.773334 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 13 08:11:14.773824 kubelet[2569]: E0213 08:11:14.773384 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23
Feb 13 08:11:14.773824 kubelet[2569]: E0213 08:11:14.773693 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b"
Feb 13 08:11:14.773824 kubelet[2569]: E0213 08:11:14.773725 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b}
Feb 13 08:11:14.774325 env[1458]: time="2024-02-13T08:11:14.773446215Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 08:11:14.774416 kubelet[2569]: E0213 08:11:14.773781 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 13 08:11:14.774416 kubelet[2569]: E0213 08:11:14.773824 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06
Feb 13 08:11:15.706275 env[1458]: time="2024-02-13T08:11:15.706234239Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\""
Feb 13 08:11:15.725493 env[1458]: time="2024-02-13T08:11:15.725459958Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 08:11:15.725757 kubelet[2569]: E0213 08:11:15.725646 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100"
Feb 13 08:11:15.725757 kubelet[2569]: E0213 08:11:15.725696 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100}
Feb 13 08:11:15.725757 kubelet[2569]: E0213 08:11:15.725718 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 13 08:11:15.725757 kubelet[2569]: E0213 08:11:15.725736 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755
Feb 13 08:11:16.705672 env[1458]: time="2024-02-13T08:11:16.705610231Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\""
Feb 13 08:11:16.722420 env[1458]: time="2024-02-13T08:11:16.722358809Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 08:11:16.722539 kubelet[2569]: E0213 08:11:16.722521 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e"
Feb 13 08:11:16.722703 kubelet[2569]: E0213 08:11:16.722548 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e}
Feb 13 08:11:16.722703 kubelet[2569]: E0213 08:11:16.722570 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 13 08:11:16.722703 kubelet[2569]: E0213 08:11:16.722589 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b
Feb 13 08:11:17.651072 systemd[1]: Started sshd@46-145.40.90.207:22-139.178.68.195:48228.service.
Feb 13 08:11:17.649000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@46-145.40.90.207:22-139.178.68.195:48228 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:11:17.678170 kernel: kauditd_printk_skb: 1 callbacks suppressed
Feb 13 08:11:17.678245 kernel: audit: type=1130 audit(1707811877.649:1673): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@46-145.40.90.207:22-139.178.68.195:48228 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:11:17.794000 audit[12740]: USER_ACCT pid=12740 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:11:17.795890 sshd[12740]: Accepted publickey for core from 139.178.68.195 port 48228 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg
Feb 13 08:11:17.797301 sshd[12740]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Feb 13 08:11:17.799612 systemd-logind[1446]: New session 44 of user core.
Feb 13 08:11:17.800181 systemd[1]: Started session-44.scope.
Feb 13 08:11:17.878928 sshd[12740]: pam_unix(sshd:session): session closed for user core
Feb 13 08:11:17.880369 systemd[1]: sshd@46-145.40.90.207:22-139.178.68.195:48228.service: Deactivated successfully.
Feb 13 08:11:17.880821 systemd[1]: session-44.scope: Deactivated successfully.
Feb 13 08:11:17.881181 systemd-logind[1446]: Session 44 logged out. Waiting for processes to exit.
Feb 13 08:11:17.881549 systemd-logind[1446]: Removed session 44.
Feb 13 08:11:17.795000 audit[12740]: CRED_ACQ pid=12740 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:11:17.978637 kernel: audit: type=1101 audit(1707811877.794:1674): pid=12740 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:11:17.978722 kernel: audit: type=1103 audit(1707811877.795:1675): pid=12740 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:11:17.978743 kernel: audit: type=1006 audit(1707811877.795:1676): pid=12740 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=44 res=1
Feb 13 08:11:17.795000 audit[12740]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd2b0471e0 a2=3 a3=0 items=0 ppid=1 pid=12740 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=44 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:11:18.129308 kernel: audit: type=1300 audit(1707811877.795:1676): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd2b0471e0 a2=3 a3=0 items=0 ppid=1 pid=12740 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=44 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:11:18.129390 kernel: audit: type=1327 audit(1707811877.795:1676): proctitle=737368643A20636F7265205B707269765D
Feb 13 08:11:17.795000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D
Feb 13 08:11:18.159832 kernel: audit: type=1105 audit(1707811877.800:1677): pid=12740 uid=0 auid=500 ses=44 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:11:17.800000 audit[12740]: USER_START pid=12740 uid=0 auid=500 ses=44 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:11:18.254329 kernel: audit: type=1103 audit(1707811877.801:1678): pid=12742 uid=0 auid=500 ses=44 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:11:17.801000 audit[12742]: CRED_ACQ pid=12742 uid=0 auid=500 ses=44 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:11:18.343532 kernel: audit: type=1106 audit(1707811877.878:1679): pid=12740 uid=0 auid=500 ses=44 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:11:17.878000 audit[12740]: USER_END pid=12740 uid=0 auid=500 ses=44 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:11:17.878000 audit[12740]: CRED_DISP pid=12740 uid=0 auid=500 ses=44 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:11:18.528359 kernel: audit: type=1104 audit(1707811877.878:1680): pid=12740 uid=0 auid=500 ses=44 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:11:17.878000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@46-145.40.90.207:22-139.178.68.195:48228 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:11:22.888045 systemd[1]: Started sshd@47-145.40.90.207:22-139.178.68.195:48236.service.
Feb 13 08:11:22.886000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@47-145.40.90.207:22-139.178.68.195:48236 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:11:22.914959 kernel: kauditd_printk_skb: 1 callbacks suppressed
Feb 13 08:11:22.915061 kernel: audit: type=1130 audit(1707811882.886:1682): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@47-145.40.90.207:22-139.178.68.195:48236 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:11:23.031030 sshd[12765]: Accepted publickey for core from 139.178.68.195 port 48236 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg
Feb 13 08:11:23.029000 audit[12765]: USER_ACCT pid=12765 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:11:23.031966 sshd[12765]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Feb 13 08:11:23.034352 systemd-logind[1446]: New session 45 of user core.
Feb 13 08:11:23.034927 systemd[1]: Started session-45.scope.
Feb 13 08:11:23.114872 sshd[12765]: pam_unix(sshd:session): session closed for user core
Feb 13 08:11:23.116855 systemd[1]: sshd@47-145.40.90.207:22-139.178.68.195:48236.service: Deactivated successfully.
Feb 13 08:11:23.117290 systemd[1]: session-45.scope: Deactivated successfully.
Feb 13 08:11:23.117597 systemd-logind[1446]: Session 45 logged out. Waiting for processes to exit.
Feb 13 08:11:23.118321 systemd[1]: Started sshd@48-145.40.90.207:22-139.178.68.195:48250.service.
Feb 13 08:11:23.118664 systemd-logind[1446]: Removed session 45.
Feb 13 08:11:23.030000 audit[12765]: CRED_ACQ pid=12765 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:11:23.215339 kernel: audit: type=1101 audit(1707811883.029:1683): pid=12765 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:11:23.215382 kernel: audit: type=1103 audit(1707811883.030:1684): pid=12765 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:11:23.215402 kernel: audit: type=1006 audit(1707811883.030:1685): pid=12765 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=45 res=1
Feb 13 08:11:23.030000 audit[12765]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffedc60a2f0 a2=3 a3=0 items=0 ppid=1 pid=12765 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=45 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:11:23.301789 sshd[12791]: Accepted publickey for core from 139.178.68.195 port 48250 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg
Feb 13 08:11:23.302955 sshd[12791]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Feb 13 08:11:23.305210 systemd-logind[1446]: New session 46 of user core.
Feb 13 08:11:23.305703 systemd[1]: Started session-46.scope.
Feb 13 08:11:23.365965 kernel: audit: type=1300 audit(1707811883.030:1685): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffedc60a2f0 a2=3 a3=0 items=0 ppid=1 pid=12765 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=45 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:11:23.366044 kernel: audit: type=1327 audit(1707811883.030:1685): proctitle=737368643A20636F7265205B707269765D
Feb 13 08:11:23.030000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D
Feb 13 08:11:23.396481 kernel: audit: type=1105 audit(1707811883.035:1686): pid=12765 uid=0 auid=500 ses=45 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:11:23.035000 audit[12765]: USER_START pid=12765 uid=0 auid=500 ses=45 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:11:23.490923 kernel: audit: type=1103 audit(1707811883.035:1687): pid=12767 uid=0 auid=500 ses=45 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:11:23.035000 audit[12767]: CRED_ACQ pid=12767 uid=0 auid=500 ses=45 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:11:23.580127 kernel: audit: type=1106 audit(1707811883.113:1688): pid=12765 uid=0 auid=500 ses=45 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:11:23.113000 audit[12765]: USER_END pid=12765 uid=0 auid=500 ses=45 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:11:23.113000 audit[12765]: CRED_DISP pid=12765 uid=0 auid=500 ses=45 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:11:23.676692 kernel: audit: type=1104 audit(1707811883.113:1689): pid=12765 uid=0 auid=500 ses=45 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:11:23.115000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@47-145.40.90.207:22-139.178.68.195:48236 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:11:23.116000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@48-145.40.90.207:22-139.178.68.195:48250 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:11:23.300000 audit[12791]: USER_ACCT pid=12791 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:11:23.301000 audit[12791]: CRED_ACQ pid=12791 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:11:23.301000 audit[12791]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe9252e6a0 a2=3 a3=0 items=0 ppid=1 pid=12791 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=46 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:11:23.301000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D
Feb 13 08:11:23.306000 audit[12791]: USER_START pid=12791 uid=0 auid=500 ses=46 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:11:23.306000 audit[12793]: CRED_ACQ pid=12793 uid=0 auid=500 ses=46 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:11:23.771185 sshd[12791]: pam_unix(sshd:session): session closed for user core
Feb 13 08:11:23.770000 audit[12791]: USER_END pid=12791 uid=0 auid=500 ses=46 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:11:23.770000 audit[12791]: CRED_DISP pid=12791 uid=0 auid=500 ses=46 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:11:23.773117 systemd[1]: sshd@48-145.40.90.207:22-139.178.68.195:48250.service: Deactivated successfully.
Feb 13 08:11:23.771000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@48-145.40.90.207:22-139.178.68.195:48250 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:11:23.773540 systemd[1]: session-46.scope: Deactivated successfully.
Feb 13 08:11:23.773876 systemd-logind[1446]: Session 46 logged out. Waiting for processes to exit.
Feb 13 08:11:23.774709 systemd[1]: Started sshd@49-145.40.90.207:22-139.178.68.195:48252.service.
Feb 13 08:11:23.773000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@49-145.40.90.207:22-139.178.68.195:48252 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:11:23.775086 systemd-logind[1446]: Removed session 46.
Feb 13 08:11:23.807000 audit[12814]: USER_ACCT pid=12814 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:11:23.809563 sshd[12814]: Accepted publickey for core from 139.178.68.195 port 48252 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg
Feb 13 08:11:23.809000 audit[12814]: CRED_ACQ pid=12814 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:11:23.810000 audit[12814]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd3f637ca0 a2=3 a3=0 items=0 ppid=1 pid=12814 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=47 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:11:23.810000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D
Feb 13 08:11:23.812450 sshd[12814]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Feb 13 08:11:23.822182 systemd-logind[1446]: New session 47 of user core.
Feb 13 08:11:23.825822 systemd[1]: Started session-47.scope.
Feb 13 08:11:23.837000 audit[12814]: USER_START pid=12814 uid=0 auid=500 ses=47 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:11:23.840000 audit[12819]: CRED_ACQ pid=12819 uid=0 auid=500 ses=47 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:11:23.967449 sshd[12814]: pam_unix(sshd:session): session closed for user core
Feb 13 08:11:23.966000 audit[12814]: USER_END pid=12814 uid=0 auid=500 ses=47 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:11:23.966000 audit[12814]: CRED_DISP pid=12814 uid=0 auid=500 ses=47 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:11:23.968961 systemd[1]: sshd@49-145.40.90.207:22-139.178.68.195:48252.service: Deactivated successfully.
Feb 13 08:11:23.967000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@49-145.40.90.207:22-139.178.68.195:48252 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:11:23.969442 systemd[1]: session-47.scope: Deactivated successfully.
Feb 13 08:11:23.969857 systemd-logind[1446]: Session 47 logged out. Waiting for processes to exit.
Feb 13 08:11:23.970273 systemd-logind[1446]: Removed session 47.
Feb 13 08:11:26.705944 env[1458]: time="2024-02-13T08:11:26.705907248Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\""
Feb 13 08:11:26.722589 env[1458]: time="2024-02-13T08:11:26.722550897Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 08:11:26.722769 kubelet[2569]: E0213 08:11:26.722754 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100"
Feb 13 08:11:26.722989 kubelet[2569]: E0213 08:11:26.722787 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100}
Feb 13 08:11:26.722989 kubelet[2569]: E0213 08:11:26.722823 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 13 08:11:26.722989 kubelet[2569]: E0213 08:11:26.722854 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755
Feb 13 08:11:27.706977 env[1458]: time="2024-02-13T08:11:27.706858263Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\""
Feb 13 08:11:27.733406 env[1458]: time="2024-02-13T08:11:27.733344993Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 08:11:27.733600 kubelet[2569]: E0213 08:11:27.733587 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b"
Feb 13 08:11:27.733803 kubelet[2569]: E0213 08:11:27.733617 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b}
Feb 13 08:11:27.733803 kubelet[2569]: E0213 08:11:27.733678 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 13 08:11:27.733803 kubelet[2569]: E0213 08:11:27.733704 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06
Feb 13 08:11:28.706073 env[1458]: time="2024-02-13T08:11:28.705980847Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\""
Feb 13 08:11:28.735859 env[1458]: time="2024-02-13T08:11:28.735783763Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 08:11:28.736150 kubelet[2569]: E0213 08:11:28.736007 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e"
Feb 13 08:11:28.736150 kubelet[2569]: E0213 08:11:28.736041 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e}
Feb 13 08:11:28.736150 kubelet[2569]: E0213 08:11:28.736072 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 13 08:11:28.736150 kubelet[2569]: E0213 08:11:28.736097 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b
Feb 13 08:11:28.977263 systemd[1]: Started sshd@50-145.40.90.207:22-139.178.68.195:54918.service.
Feb 13 08:11:28.975000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@50-145.40.90.207:22-139.178.68.195:54918 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:11:29.004278 kernel: kauditd_printk_skb: 23 callbacks suppressed
Feb 13 08:11:29.004334 kernel: audit: type=1130 audit(1707811888.975:1709): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@50-145.40.90.207:22-139.178.68.195:54918 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:11:29.119000 audit[12931]: USER_ACCT pid=12931 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:11:29.121103 sshd[12931]: Accepted publickey for core from 139.178.68.195 port 54918 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg
Feb 13 08:11:29.123917 sshd[12931]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Feb 13 08:11:29.126109 systemd-logind[1446]: New session 48 of user core.
Feb 13 08:11:29.126831 systemd[1]: Started session-48.scope.
Feb 13 08:11:29.122000 audit[12931]: CRED_ACQ pid=12931 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:11:29.217444 sshd[12931]: pam_unix(sshd:session): session closed for user core
Feb 13 08:11:29.218857 systemd[1]: sshd@50-145.40.90.207:22-139.178.68.195:54918.service: Deactivated successfully.
Feb 13 08:11:29.219307 systemd[1]: session-48.scope: Deactivated successfully.
Feb 13 08:11:29.219602 systemd-logind[1446]: Session 48 logged out. Waiting for processes to exit.
Feb 13 08:11:29.220268 systemd-logind[1446]: Removed session 48.
Feb 13 08:11:29.305641 kernel: audit: type=1101 audit(1707811889.119:1710): pid=12931 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:11:29.305678 kernel: audit: type=1103 audit(1707811889.122:1711): pid=12931 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:11:29.305697 kernel: audit: type=1006 audit(1707811889.122:1712): pid=12931 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=48 res=1
Feb 13 08:11:29.364269 kernel: audit: type=1300 audit(1707811889.122:1712): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffcba13c450 a2=3 a3=0 items=0 ppid=1 pid=12931 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=48 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:11:29.122000 audit[12931]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffcba13c450 a2=3 a3=0 items=0 ppid=1 pid=12931 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=48 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:11:29.456268 kernel: audit: type=1327 audit(1707811889.122:1712): proctitle=737368643A20636F7265205B707269765D
Feb 13 08:11:29.122000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D
Feb 13 08:11:29.486708 kernel: audit: type=1105 audit(1707811889.127:1713): pid=12931 uid=0 auid=500 ses=48 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:11:29.127000 audit[12931]: USER_START pid=12931 uid=0 auid=500 ses=48 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:11:29.581190 kernel: audit: type=1103 audit(1707811889.128:1714): pid=12933 uid=0 auid=500 ses=48 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:11:29.128000 audit[12933]: CRED_ACQ pid=12933 uid=0 auid=500 ses=48 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:11:29.216000 audit[12931]: USER_END pid=12931 uid=0 auid=500 ses=48 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:11:29.705890 env[1458]: time="2024-02-13T08:11:29.705843223Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\""
Feb 13 08:11:29.717801 env[1458]: time="2024-02-13T08:11:29.717738000Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 08:11:29.717926 kubelet[2569]: E0213 08:11:29.717909 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988"
Feb 13 08:11:29.717960 kubelet[2569]: E0213 08:11:29.717936 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988}
Feb 13 08:11:29.717960 kubelet[2569]: E0213 08:11:29.717958 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 13 08:11:29.718029 kubelet[2569]: E0213 08:11:29.717977 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23
Feb 13 08:11:29.765894 kernel: audit: type=1106 audit(1707811889.216:1715): pid=12931 uid=0 auid=500 ses=48 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:11:29.765936 kernel: audit: type=1104 audit(1707811889.216:1716): pid=12931 uid=0 auid=500 ses=48 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:11:29.216000 audit[12931]: CRED_DISP pid=12931 uid=0 auid=500 ses=48 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:11:29.217000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@50-145.40.90.207:22-139.178.68.195:54918 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:11:34.226813 systemd[1]: Started sshd@51-145.40.90.207:22-139.178.68.195:54924.service.
Feb 13 08:11:34.225000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@51-145.40.90.207:22-139.178.68.195:54924 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:11:34.253828 kernel: kauditd_printk_skb: 1 callbacks suppressed
Feb 13 08:11:34.253913 kernel: audit: type=1130 audit(1707811894.225:1718): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@51-145.40.90.207:22-139.178.68.195:54924 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:11:34.370000 audit[12987]: USER_ACCT pid=12987 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:11:34.372121 sshd[12987]: Accepted publickey for core from 139.178.68.195 port 54924 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg
Feb 13 08:11:34.373012 sshd[12987]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Feb 13 08:11:34.375212 systemd-logind[1446]: New session 49 of user core.
Feb 13 08:11:34.375705 systemd[1]: Started session-49.scope.
Feb 13 08:11:34.371000 audit[12987]: CRED_ACQ pid=12987 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:11:34.466035 sshd[12987]: pam_unix(sshd:session): session closed for user core
Feb 13 08:11:34.467387 systemd[1]: sshd@51-145.40.90.207:22-139.178.68.195:54924.service: Deactivated successfully.
Feb 13 08:11:34.467827 systemd[1]: session-49.scope: Deactivated successfully.
Feb 13 08:11:34.468238 systemd-logind[1446]: Session 49 logged out. Waiting for processes to exit.
Feb 13 08:11:34.468723 systemd-logind[1446]: Removed session 49.
Feb 13 08:11:34.554002 kernel: audit: type=1101 audit(1707811894.370:1719): pid=12987 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:11:34.554038 kernel: audit: type=1103 audit(1707811894.371:1720): pid=12987 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:11:34.554055 kernel: audit: type=1006 audit(1707811894.371:1721): pid=12987 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=49 res=1 Feb 13 08:11:34.612607 kernel: audit: type=1300 audit(1707811894.371:1721): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe29208df0 a2=3 a3=0 items=0 ppid=1 pid=12987 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=49 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:11:34.371000 audit[12987]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe29208df0 a2=3 a3=0 items=0 ppid=1 pid=12987 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=49 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:11:34.704622 kernel: audit: type=1327 audit(1707811894.371:1721): proctitle=737368643A20636F7265205B707269765D Feb 13 08:11:34.371000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:11:34.735099 kernel: audit: type=1105 audit(1707811894.376:1722): pid=12987 uid=0 auid=500 ses=49 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:11:34.376000 audit[12987]: USER_START pid=12987 uid=0 auid=500 ses=49 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:11:34.829602 kernel: audit: type=1103 audit(1707811894.376:1723): pid=12989 uid=0 auid=500 ses=49 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:11:34.376000 audit[12989]: CRED_ACQ pid=12989 uid=0 auid=500 ses=49 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:11:34.918850 kernel: audit: type=1106 audit(1707811894.465:1724): pid=12987 uid=0 auid=500 ses=49 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:11:34.465000 audit[12987]: USER_END pid=12987 uid=0 auid=500 ses=49 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close 
grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:11:35.014386 kernel: audit: type=1104 audit(1707811894.465:1725): pid=12987 uid=0 auid=500 ses=49 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:11:34.465000 audit[12987]: CRED_DISP pid=12987 uid=0 auid=500 ses=49 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:11:34.465000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@51-145.40.90.207:22-139.178.68.195:54924 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:11:38.706690 env[1458]: time="2024-02-13T08:11:38.706552015Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 08:11:38.733830 env[1458]: time="2024-02-13T08:11:38.733731398Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:11:38.734003 kubelet[2569]: E0213 08:11:38.733990 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 08:11:38.734199 kubelet[2569]: E0213 08:11:38.734034 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 08:11:38.734199 kubelet[2569]: E0213 08:11:38.734065 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:11:38.734199 kubelet[2569]: E0213 08:11:38.734090 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and 
has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 08:11:39.476052 systemd[1]: Started sshd@52-145.40.90.207:22-139.178.68.195:56832.service. Feb 13 08:11:39.474000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@52-145.40.90.207:22-139.178.68.195:56832 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:11:39.503123 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:11:39.503190 kernel: audit: type=1130 audit(1707811899.474:1727): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@52-145.40.90.207:22-139.178.68.195:56832 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:11:39.620000 audit[13040]: USER_ACCT pid=13040 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:11:39.621469 sshd[13040]: Accepted publickey for core from 139.178.68.195 port 56832 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:11:39.623932 sshd[13040]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:11:39.626039 systemd-logind[1446]: New session 50 of user core. Feb 13 08:11:39.626732 systemd[1]: Started session-50.scope. Feb 13 08:11:39.705668 env[1458]: time="2024-02-13T08:11:39.705642214Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 08:11:39.622000 audit[13040]: CRED_ACQ pid=13040 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:11:39.717975 sshd[13040]: pam_unix(sshd:session): session closed for user core Feb 13 08:11:39.719034 env[1458]: time="2024-02-13T08:11:39.718979627Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:11:39.719204 kubelet[2569]: E0213 08:11:39.719171 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 08:11:39.719204 kubelet[2569]: E0213 08:11:39.719200 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 08:11:39.719282 kubelet[2569]: E0213 08:11:39.719222 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for 
\"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:11:39.719282 kubelet[2569]: E0213 08:11:39.719240 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 08:11:39.719453 systemd[1]: sshd@52-145.40.90.207:22-139.178.68.195:56832.service: Deactivated successfully. Feb 13 08:11:39.719924 systemd[1]: session-50.scope: Deactivated successfully. Feb 13 08:11:39.720345 systemd-logind[1446]: Session 50 logged out. Waiting for processes to exit. Feb 13 08:11:39.720794 systemd-logind[1446]: Removed session 50. Feb 13 08:11:39.805970 kernel: audit: type=1101 audit(1707811899.620:1728): pid=13040 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:11:39.806008 kernel: audit: type=1103 audit(1707811899.622:1729): pid=13040 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:11:39.806026 kernel: audit: type=1006 audit(1707811899.622:1730): pid=13040 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=50 res=1 Feb 13 08:11:39.864617 kernel: audit: type=1300 audit(1707811899.622:1730): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe6d58d200 a2=3 a3=0 items=0 ppid=1 pid=13040 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=50 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:11:39.622000 audit[13040]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe6d58d200 a2=3 a3=0 items=0 ppid=1 pid=13040 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=50 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:11:39.956703 kernel: audit: type=1327 audit(1707811899.622:1730): proctitle=737368643A20636F7265205B707269765D Feb 13 08:11:39.622000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:11:39.987150 kernel: audit: type=1105 audit(1707811899.627:1731): pid=13040 uid=0 auid=500 ses=50 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:11:39.627000 audit[13040]: USER_START pid=13040 uid=0 auid=500 
ses=50 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:11:40.081622 kernel: audit: type=1103 audit(1707811899.627:1732): pid=13042 uid=0 auid=500 ses=50 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:11:39.627000 audit[13042]: CRED_ACQ pid=13042 uid=0 auid=500 ses=50 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:11:40.170997 kernel: audit: type=1106 audit(1707811899.717:1733): pid=13040 uid=0 auid=500 ses=50 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:11:39.717000 audit[13040]: USER_END pid=13040 uid=0 auid=500 ses=50 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:11:40.266514 kernel: audit: type=1104 audit(1707811899.717:1734): pid=13040 uid=0 auid=500 ses=50 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:11:39.717000 audit[13040]: CRED_DISP pid=13040 uid=0 auid=500 ses=50 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:11:39.717000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@52-145.40.90.207:22-139.178.68.195:56832 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:11:41.707056 env[1458]: time="2024-02-13T08:11:41.706956035Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 08:11:41.731192 env[1458]: time="2024-02-13T08:11:41.731159136Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:11:41.731351 kubelet[2569]: E0213 08:11:41.731309 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 08:11:41.731351 kubelet[2569]: E0213 08:11:41.731333 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 08:11:41.731528 kubelet[2569]: E0213 08:11:41.731354 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:11:41.731528 kubelet[2569]: E0213 08:11:41.731372 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 08:11:42.705853 env[1458]: time="2024-02-13T08:11:42.705809530Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 08:11:42.723944 env[1458]: time="2024-02-13T08:11:42.723881841Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:11:42.724149 kubelet[2569]: E0213 08:11:42.724064 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox 
\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 08:11:42.724149 kubelet[2569]: E0213 08:11:42.724091 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 08:11:42.724149 kubelet[2569]: E0213 08:11:42.724113 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:11:42.724149 kubelet[2569]: E0213 08:11:42.724132 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 08:11:44.729188 systemd[1]: Started sshd@53-145.40.90.207:22-139.178.68.195:56840.service. Feb 13 08:11:44.727000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@53-145.40.90.207:22-139.178.68.195:56840 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:11:44.769093 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:11:44.769155 kernel: audit: type=1130 audit(1707811904.727:1736): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@53-145.40.90.207:22-139.178.68.195:56840 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:11:44.885000 audit[13152]: USER_ACCT pid=13152 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:11:44.886748 sshd[13152]: Accepted publickey for core from 139.178.68.195 port 56840 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:11:44.887911 sshd[13152]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:11:44.890241 systemd-logind[1446]: New session 51 of user core. Feb 13 08:11:44.890951 systemd[1]: Started session-51.scope. 
Feb 13 08:11:44.886000 audit[13152]: CRED_ACQ pid=13152 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:11:44.981657 sshd[13152]: pam_unix(sshd:session): session closed for user core Feb 13 08:11:44.983108 systemd[1]: sshd@53-145.40.90.207:22-139.178.68.195:56840.service: Deactivated successfully. Feb 13 08:11:44.983578 systemd[1]: session-51.scope: Deactivated successfully. Feb 13 08:11:44.983963 systemd-logind[1446]: Session 51 logged out. Waiting for processes to exit. Feb 13 08:11:44.984401 systemd-logind[1446]: Removed session 51. Feb 13 08:11:45.069807 kernel: audit: type=1101 audit(1707811904.885:1737): pid=13152 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:11:45.069899 kernel: audit: type=1103 audit(1707811904.886:1738): pid=13152 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:11:45.069916 kernel: audit: type=1006 audit(1707811904.886:1739): pid=13152 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=51 res=1 Feb 13 08:11:45.128447 kernel: audit: type=1300 audit(1707811904.886:1739): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffda0088740 a2=3 a3=0 items=0 ppid=1 pid=13152 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=51 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:11:44.886000 audit[13152]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffda0088740 a2=3 a3=0 items=0 ppid=1 pid=13152 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=51 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:11:44.886000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:11:45.250964 kernel: audit: type=1327 audit(1707811904.886:1739): proctitle=737368643A20636F7265205B707269765D Feb 13 08:11:45.251004 kernel: audit: type=1105 audit(1707811904.891:1740): pid=13152 uid=0 auid=500 ses=51 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:11:44.891000 audit[13152]: USER_START pid=13152 uid=0 auid=500 ses=51 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:11:45.345411 kernel: audit: type=1103 audit(1707811904.892:1741): pid=13154 uid=0 auid=500 ses=51 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:11:44.892000 audit[13154]: CRED_ACQ pid=13154 uid=0 auid=500 ses=51 
subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:11:45.434645 kernel: audit: type=1106 audit(1707811904.980:1742): pid=13152 uid=0 auid=500 ses=51 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:11:44.980000 audit[13152]: USER_END pid=13152 uid=0 auid=500 ses=51 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:11:44.980000 audit[13152]: CRED_DISP pid=13152 uid=0 auid=500 ses=51 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:11:45.619449 kernel: audit: type=1104 audit(1707811904.980:1743): pid=13152 uid=0 auid=500 ses=51 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:11:44.981000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@53-145.40.90.207:22-139.178.68.195:56840 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:11:46.121000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:11:46.121000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c001b22900 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:11:46.121000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:11:46.121000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:11:46.121000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c000f89530 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:11:46.121000 audit: PROCTITLE 
proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:11:46.190000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:11:46.190000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5c a1=c01602f890 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:11:46.190000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:11:46.190000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/apiserver.crt" dev="sdb9" ino=524817 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:11:46.190000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5d a1=c01604bd70 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:11:46.190000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:11:46.190000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:11:46.190000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5c a1=c00e5212e0 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:11:46.190000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:11:46.933000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-client.crt" dev="sdb9" ino=524823 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:11:46.933000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5c a1=c01a96b590 a2=fc6 a3=0 items=0 ppid=2256 
pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:11:46.933000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:11:46.933000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:11:46.933000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5c a1=c01a96b5c0 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:11:46.933000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:11:46.933000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:11:46.933000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5c a1=c0028c7020 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:11:46.933000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:11:49.993750 systemd[1]: Started sshd@54-145.40.90.207:22-139.178.68.195:41294.service. Feb 13 08:11:49.992000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@54-145.40.90.207:22-139.178.68.195:41294 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:11:50.021377 kernel: kauditd_printk_skb: 25 callbacks suppressed Feb 13 08:11:50.021435 kernel: audit: type=1130 audit(1707811909.992:1753): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@54-145.40.90.207:22-139.178.68.195:41294 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:11:50.138000 audit[13179]: USER_ACCT pid=13179 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:11:50.139169 sshd[13179]: Accepted publickey for core from 139.178.68.195 port 41294 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:11:50.140377 sshd[13179]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:11:50.142979 systemd-logind[1446]: New session 52 of user core. Feb 13 08:11:50.143484 systemd[1]: Started session-52.scope. Feb 13 08:11:50.138000 audit[13179]: CRED_ACQ pid=13179 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:11:50.233085 sshd[13179]: pam_unix(sshd:session): session closed for user core Feb 13 08:11:50.234356 systemd[1]: sshd@54-145.40.90.207:22-139.178.68.195:41294.service: Deactivated successfully. Feb 13 08:11:50.234788 systemd[1]: session-52.scope: Deactivated successfully. Feb 13 08:11:50.235206 systemd-logind[1446]: Session 52 logged out. Waiting for processes to exit. Feb 13 08:11:50.235627 systemd-logind[1446]: Removed session 52. Feb 13 08:11:50.321084 kernel: audit: type=1101 audit(1707811910.138:1754): pid=13179 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:11:50.321126 kernel: audit: type=1103 audit(1707811910.138:1755): pid=13179 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:11:50.321143 kernel: audit: type=1006 audit(1707811910.138:1756): pid=13179 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=52 res=1 Feb 13 08:11:50.379709 kernel: audit: type=1300 audit(1707811910.138:1756): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffde35a9c60 a2=3 a3=0 items=0 ppid=1 pid=13179 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=52 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:11:50.138000 audit[13179]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffde35a9c60 a2=3 a3=0 items=0 ppid=1 pid=13179 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=52 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:11:50.471705 kernel: audit: type=1327 audit(1707811910.138:1756): proctitle=737368643A20636F7265205B707269765D Feb 13 08:11:50.138000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:11:50.502177 kernel: audit: type=1105 audit(1707811910.144:1757): pid=13179 uid=0 auid=500 ses=52 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 
08:11:50.144000 audit[13179]: USER_START pid=13179 uid=0 auid=500 ses=52 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:11:50.596612 kernel: audit: type=1103 audit(1707811910.144:1758): pid=13181 uid=0 auid=500 ses=52 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:11:50.144000 audit[13181]: CRED_ACQ pid=13181 uid=0 auid=500 ses=52 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:11:50.685809 kernel: audit: type=1106 audit(1707811910.231:1759): pid=13179 uid=0 auid=500 ses=52 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:11:50.231000 audit[13179]: USER_END pid=13179 uid=0 auid=500 ses=52 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:11:50.688043 systemd[1]: Started sshd@55-145.40.90.207:22-218.92.0.45:12030.service. Feb 13 08:11:50.705859 env[1458]: time="2024-02-13T08:11:50.705838199Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 08:11:50.706016 env[1458]: time="2024-02-13T08:11:50.705838189Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 08:11:50.722160 env[1458]: time="2024-02-13T08:11:50.722081655Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:11:50.722332 kubelet[2569]: E0213 08:11:50.722291 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 08:11:50.722332 kubelet[2569]: E0213 08:11:50.722320 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 08:11:50.722518 env[1458]: time="2024-02-13T08:11:50.722301019Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox 
\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:11:50.722548 kubelet[2569]: E0213 08:11:50.722354 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:11:50.722548 kubelet[2569]: E0213 08:11:50.722372 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 08:11:50.722548 kubelet[2569]: E0213 08:11:50.722407 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 08:11:50.722548 kubelet[2569]: E0213 08:11:50.722420 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 08:11:50.722679 kubelet[2569]: E0213 08:11:50.722439 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:11:50.722679 kubelet[2569]: E0213 08:11:50.722453 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 08:11:50.781326 kernel: audit: type=1104 audit(1707811910.232:1760): pid=13179 uid=0 auid=500 ses=52 
subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:11:50.232000 audit[13179]: CRED_DISP pid=13179 uid=0 auid=500 ses=52 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:11:50.232000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@54-145.40.90.207:22-139.178.68.195:41294 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:11:50.686000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@55-145.40.90.207:22-218.92.0.45:12030 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:11:50.870868 sshd[13203]: Unable to negotiate with 218.92.0.45 port 12030: no matching key exchange method found. Their offer: diffie-hellman-group1-sha1,diffie-hellman-group14-sha1,diffie-hellman-group-exchange-sha1 [preauth] Feb 13 08:11:50.871246 systemd[1]: sshd@55-145.40.90.207:22-218.92.0.45:12030.service: Deactivated successfully. Feb 13 08:11:50.869000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@55-145.40.90.207:22-218.92.0.45:12030 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:11:50.872000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:11:50.872000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c001d0bb00 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:11:50.872000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:11:50.877000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:11:50.877000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c001b22920 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:11:50.877000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:11:50.877000 
audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:11:50.877000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c001b22940 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:11:50.877000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:11:50.880000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:11:50.880000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c001d0bc20 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:11:50.880000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:11:52.706807 env[1458]: time="2024-02-13T08:11:52.706714451Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 08:11:52.736149 env[1458]: time="2024-02-13T08:11:52.736063778Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:11:52.736420 kubelet[2569]: E0213 08:11:52.736343 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 08:11:52.736420 kubelet[2569]: E0213 08:11:52.736397 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 08:11:52.736602 kubelet[2569]: E0213 08:11:52.736434 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = 
Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:11:52.736602 kubelet[2569]: E0213 08:11:52.736453 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 08:11:54.706834 env[1458]: time="2024-02-13T08:11:54.706694177Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 08:11:54.759612 env[1458]: time="2024-02-13T08:11:54.759511516Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:11:54.759851 kubelet[2569]: E0213 08:11:54.759824 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 08:11:54.760251 kubelet[2569]: E0213 08:11:54.759874 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 08:11:54.760251 kubelet[2569]: E0213 08:11:54.759932 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:11:54.760251 kubelet[2569]: E0213 08:11:54.759975 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" 
podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 08:11:55.242334 systemd[1]: Started sshd@56-145.40.90.207:22-139.178.68.195:41302.service. Feb 13 08:11:55.240000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@56-145.40.90.207:22-139.178.68.195:41302 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:11:55.269403 kernel: kauditd_printk_skb: 15 callbacks suppressed Feb 13 08:11:55.269485 kernel: audit: type=1130 audit(1707811915.240:1768): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@56-145.40.90.207:22-139.178.68.195:41302 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:11:55.385000 audit[13326]: USER_ACCT pid=13326 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:11:55.387546 sshd[13326]: Accepted publickey for core from 139.178.68.195 port 41302 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:11:55.388927 sshd[13326]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:11:55.391218 systemd-logind[1446]: New session 53 of user core. Feb 13 08:11:55.391705 systemd[1]: Started session-53.scope. Feb 13 08:11:55.471673 sshd[13326]: pam_unix(sshd:session): session closed for user core Feb 13 08:11:55.473202 systemd[1]: sshd@56-145.40.90.207:22-139.178.68.195:41302.service: Deactivated successfully. Feb 13 08:11:55.473627 systemd[1]: session-53.scope: Deactivated successfully. Feb 13 08:11:55.474055 systemd-logind[1446]: Session 53 logged out. Waiting for processes to exit. Feb 13 08:11:55.474526 systemd-logind[1446]: Removed session 53. 
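For readability, a note on the audit records above: arch=c000003e is x86_64, where syscall 254 is inotify_add_watch, and exit=-13 is -EACCES, so the kube-controller-manager was refused an inotify watch on /etc/kubernetes/pki/ca.crt, matching the denied { watch } permission (auid=4294967295 is the unset audit UID, i.e. -1). The PROCTITLE value is the process's argv, hex-encoded with NUL separators between arguments; a minimal standalone Python sketch, not part of any tool in this log, decodes it:

# Decode an audit PROCTITLE value: hex-encoded argv with NUL separators.
def decode_proctitle(hex_value: str) -> str:
    argv = bytes.fromhex(hex_value).split(b"\x00")
    return " ".join(arg.decode("utf-8", errors="replace") for arg in argv)

# From the sshd records that follow in this log:
print(decode_proctitle("737368643A20636F7265205B707269765D"))  # -> sshd: core [priv]

Applied to the kube-controller PROCTITLE above, it yields "kube-controller-manager --allocate-node-cidrs=true --authentication-kubeconfig=/etc/kubernetes/controller-manager.conf --authori"; the audit subsystem truncates long proctitles, so the tail is genuinely absent from the record.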
Feb 13 08:11:55.387000 audit[13326]: CRED_ACQ pid=13326 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:11:55.569429 kernel: audit: type=1101 audit(1707811915.385:1769): pid=13326 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:11:55.569473 kernel: audit: type=1103 audit(1707811915.387:1770): pid=13326 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:11:55.569491 kernel: audit: type=1006 audit(1707811915.387:1771): pid=13326 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=53 res=1 Feb 13 08:11:55.628080 kernel: audit: type=1300 audit(1707811915.387:1771): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffedfd2bb20 a2=3 a3=0 items=0 ppid=1 pid=13326 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=53 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:11:55.387000 audit[13326]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffedfd2bb20 a2=3 a3=0 items=0 ppid=1 pid=13326 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=53 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:11:55.720102 kernel: audit: type=1327 audit(1707811915.387:1771): proctitle=737368643A20636F7265205B707269765D Feb 13 08:11:55.387000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:11:55.750583 kernel: audit: type=1105 audit(1707811915.392:1772): pid=13326 uid=0 auid=500 ses=53 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:11:55.392000 audit[13326]: USER_START pid=13326 uid=0 auid=500 ses=53 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:11:55.845080 kernel: audit: type=1103 audit(1707811915.392:1773): pid=13328 uid=0 auid=500 ses=53 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:11:55.392000 audit[13328]: CRED_ACQ pid=13328 uid=0 auid=500 ses=53 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:11:55.894660 systemd[1]: Started sshd@57-145.40.90.207:22-202.188.109.48:44552.service. 
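The sshd sequence above decodes the same way: syscall=1 is write(2) on x86_64, and exit=3 together with auid flipping from 4294967295 to 500 in the adjoining type=1006 LOGIN record is consistent with pam_loginuid writing the three-byte string "500" to /proc/self/loginuid, which opens audit session 53. An illustrative sketch of that mechanism (changing loginuid requires CAP_AUDIT_CONTROL; reading it does not):

# What pam_loginuid does per session: write the uid, in decimal ASCII, to
# /proc/self/loginuid ("500" is 3 bytes, hence exit=3 in the record above).
def set_loginuid(uid: int) -> None:
    with open("/proc/self/loginuid", "w") as fh:
        fh.write(str(uid))  # kernel emits the audit LOGIN (type=1006) record

if __name__ == "__main__":
    with open("/proc/self/loginuid") as fh:
        print("current loginuid:", fh.read().strip())  # 4294967295 == unset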
Feb 13 08:11:55.470000 audit[13326]: USER_END pid=13326 uid=0 auid=500 ses=53 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:11:56.029887 kernel: audit: type=1106 audit(1707811915.470:1774): pid=13326 uid=0 auid=500 ses=53 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:11:56.029929 kernel: audit: type=1104 audit(1707811915.470:1775): pid=13326 uid=0 auid=500 ses=53 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:11:55.470000 audit[13326]: CRED_DISP pid=13326 uid=0 auid=500 ses=53 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:11:55.471000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@56-145.40.90.207:22-139.178.68.195:41302 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:11:55.893000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@57-145.40.90.207:22-202.188.109.48:44552 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:11:57.063603 sshd[13350]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=202.188.109.48 user=root Feb 13 08:11:57.062000 audit[13350]: USER_AUTH pid=13350 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? acct="root" exe="/usr/sbin/sshd" hostname=202.188.109.48 addr=202.188.109.48 terminal=ssh res=failed' Feb 13 08:11:58.533675 sshd[13350]: Failed password for root from 202.188.109.48 port 44552 ssh2 Feb 13 08:11:58.986885 sshd[13350]: Received disconnect from 202.188.109.48 port 44552:11: Bye Bye [preauth] Feb 13 08:11:58.986885 sshd[13350]: Disconnected from authenticating user root 202.188.109.48 port 44552 [preauth] Feb 13 08:11:58.989232 systemd[1]: sshd@57-145.40.90.207:22-202.188.109.48:44552.service: Deactivated successfully. Feb 13 08:11:58.988000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@57-145.40.90.207:22-202.188.109.48:44552 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:12:00.478476 systemd[1]: Started sshd@58-145.40.90.207:22-139.178.68.195:53284.service. Feb 13 08:12:00.477000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@58-145.40.90.207:22-139.178.68.195:53284 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:12:00.505457 kernel: kauditd_printk_skb: 4 callbacks suppressed Feb 13 08:12:00.505522 kernel: audit: type=1130 audit(1707811920.477:1780): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@58-145.40.90.207:22-139.178.68.195:53284 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:12:00.620000 audit[13354]: USER_ACCT pid=13354 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:00.621941 sshd[13354]: Accepted publickey for core from 139.178.68.195 port 53284 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:12:00.623943 sshd[13354]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:12:00.626429 systemd-logind[1446]: New session 54 of user core. Feb 13 08:12:00.626899 systemd[1]: Started session-54.scope. Feb 13 08:12:00.705756 sshd[13354]: pam_unix(sshd:session): session closed for user core Feb 13 08:12:00.707257 systemd[1]: sshd@58-145.40.90.207:22-139.178.68.195:53284.service: Deactivated successfully. Feb 13 08:12:00.707736 systemd[1]: session-54.scope: Deactivated successfully. Feb 13 08:12:00.708110 systemd-logind[1446]: Session 54 logged out. Waiting for processes to exit. Feb 13 08:12:00.708514 systemd-logind[1446]: Removed session 54. Feb 13 08:12:00.622000 audit[13354]: CRED_ACQ pid=13354 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:00.803929 kernel: audit: type=1101 audit(1707811920.620:1781): pid=13354 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:00.803968 kernel: audit: type=1103 audit(1707811920.622:1782): pid=13354 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:00.803986 kernel: audit: type=1006 audit(1707811920.622:1783): pid=13354 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=54 res=1 Feb 13 08:12:00.862563 kernel: audit: type=1300 audit(1707811920.622:1783): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc40f61990 a2=3 a3=0 items=0 ppid=1 pid=13354 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=54 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:12:00.622000 audit[13354]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc40f61990 a2=3 a3=0 items=0 ppid=1 pid=13354 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=54 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:12:00.954586 kernel: audit: type=1327 audit(1707811920.622:1783): proctitle=737368643A20636F7265205B707269765D Feb 13 08:12:00.622000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 
08:12:00.985059 kernel: audit: type=1105 audit(1707811920.627:1784): pid=13354 uid=0 auid=500 ses=54 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:00.627000 audit[13354]: USER_START pid=13354 uid=0 auid=500 ses=54 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:01.079552 kernel: audit: type=1103 audit(1707811920.627:1785): pid=13356 uid=0 auid=500 ses=54 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:00.627000 audit[13356]: CRED_ACQ pid=13356 uid=0 auid=500 ses=54 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:01.168715 kernel: audit: type=1106 audit(1707811920.704:1786): pid=13354 uid=0 auid=500 ses=54 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:00.704000 audit[13354]: USER_END pid=13354 uid=0 auid=500 ses=54 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:01.264213 kernel: audit: type=1104 audit(1707811920.704:1787): pid=13354 uid=0 auid=500 ses=54 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:00.704000 audit[13354]: CRED_DISP pid=13354 uid=0 auid=500 ses=54 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:00.705000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@58-145.40.90.207:22-139.178.68.195:53284 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:12:03.706555 env[1458]: time="2024-02-13T08:12:03.706456223Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 08:12:03.732588 env[1458]: time="2024-02-13T08:12:03.732553271Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:12:03.732809 kubelet[2569]: E0213 08:12:03.732796 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 08:12:03.733002 kubelet[2569]: E0213 08:12:03.732826 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 08:12:03.733002 kubelet[2569]: E0213 08:12:03.732858 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:12:03.733002 kubelet[2569]: E0213 08:12:03.732886 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 08:12:04.706677 env[1458]: time="2024-02-13T08:12:04.706531623Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 08:12:04.733519 env[1458]: time="2024-02-13T08:12:04.733482447Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:12:04.733776 kubelet[2569]: E0213 08:12:04.733737 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox 
\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 08:12:04.733776 kubelet[2569]: E0213 08:12:04.733761 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 08:12:04.733960 kubelet[2569]: E0213 08:12:04.733783 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:12:04.733960 kubelet[2569]: E0213 08:12:04.733800 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 08:12:05.706845 env[1458]: time="2024-02-13T08:12:05.706747895Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 08:12:05.715184 systemd[1]: Started sshd@59-145.40.90.207:22-139.178.68.195:53292.service. Feb 13 08:12:05.713000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@59-145.40.90.207:22-139.178.68.195:53292 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:12:05.725440 env[1458]: time="2024-02-13T08:12:05.725363284Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:12:05.725603 kubelet[2569]: E0213 08:12:05.725591 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 08:12:05.725697 kubelet[2569]: E0213 08:12:05.725621 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 08:12:05.725697 kubelet[2569]: E0213 08:12:05.725690 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:12:05.725770 kubelet[2569]: E0213 08:12:05.725712 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 08:12:05.742217 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:12:05.742277 kernel: audit: type=1130 audit(1707811925.713:1789): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@59-145.40.90.207:22-139.178.68.195:53292 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:12:05.857932 sshd[13451]: Accepted publickey for core from 139.178.68.195 port 53292 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:12:05.857000 audit[13451]: USER_ACCT pid=13451 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:05.858919 sshd[13451]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:12:05.861521 systemd-logind[1446]: New session 55 of user core. Feb 13 08:12:05.861967 systemd[1]: Started session-55.scope. 
Feb 13 08:12:05.941687 sshd[13451]: pam_unix(sshd:session): session closed for user core Feb 13 08:12:05.943371 systemd[1]: sshd@59-145.40.90.207:22-139.178.68.195:53292.service: Deactivated successfully. Feb 13 08:12:05.943808 systemd[1]: session-55.scope: Deactivated successfully. Feb 13 08:12:05.944290 systemd-logind[1446]: Session 55 logged out. Waiting for processes to exit. Feb 13 08:12:05.944908 systemd-logind[1446]: Removed session 55. Feb 13 08:12:05.857000 audit[13451]: CRED_ACQ pid=13451 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:06.039954 kernel: audit: type=1101 audit(1707811925.857:1790): pid=13451 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:06.039994 kernel: audit: type=1103 audit(1707811925.857:1791): pid=13451 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:06.040012 kernel: audit: type=1006 audit(1707811925.857:1792): pid=13451 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=55 res=1 Feb 13 08:12:06.098559 kernel: audit: type=1300 audit(1707811925.857:1792): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffa1766640 a2=3 a3=0 items=0 ppid=1 pid=13451 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=55 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:12:05.857000 audit[13451]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffa1766640 a2=3 a3=0 items=0 ppid=1 pid=13451 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=55 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:12:06.190558 kernel: audit: type=1327 audit(1707811925.857:1792): proctitle=737368643A20636F7265205B707269765D Feb 13 08:12:05.857000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:12:06.221005 kernel: audit: type=1105 audit(1707811925.863:1793): pid=13451 uid=0 auid=500 ses=55 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:05.863000 audit[13451]: USER_START pid=13451 uid=0 auid=500 ses=55 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:06.315448 kernel: audit: type=1103 audit(1707811925.863:1794): pid=13470 uid=0 auid=500 ses=55 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:05.863000 audit[13470]: CRED_ACQ pid=13470 uid=0 auid=500 ses=55 
subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:06.404716 kernel: audit: type=1106 audit(1707811925.940:1795): pid=13451 uid=0 auid=500 ses=55 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:05.940000 audit[13451]: USER_END pid=13451 uid=0 auid=500 ses=55 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:06.500186 kernel: audit: type=1104 audit(1707811925.941:1796): pid=13451 uid=0 auid=500 ses=55 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:05.941000 audit[13451]: CRED_DISP pid=13451 uid=0 auid=500 ses=55 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:05.941000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@59-145.40.90.207:22-139.178.68.195:53292 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:12:06.706828 env[1458]: time="2024-02-13T08:12:06.706711324Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 08:12:06.734103 env[1458]: time="2024-02-13T08:12:06.734034688Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:12:06.734331 kubelet[2569]: E0213 08:12:06.734233 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 08:12:06.734331 kubelet[2569]: E0213 08:12:06.734260 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 08:12:06.734331 kubelet[2569]: E0213 08:12:06.734285 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": 
plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:12:06.734331 kubelet[2569]: E0213 08:12:06.734303 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 08:12:10.953135 systemd[1]: Started sshd@60-145.40.90.207:22-139.178.68.195:53152.service. Feb 13 08:12:10.951000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@60-145.40.90.207:22-139.178.68.195:53152 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:12:10.980491 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:12:10.980520 kernel: audit: type=1130 audit(1707811930.951:1798): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@60-145.40.90.207:22-139.178.68.195:53152 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:12:11.097000 audit[13519]: USER_ACCT pid=13519 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:11.098714 sshd[13519]: Accepted publickey for core from 139.178.68.195 port 53152 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:12:11.099975 sshd[13519]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:12:11.102228 systemd-logind[1446]: New session 56 of user core. Feb 13 08:12:11.102903 systemd[1]: Started session-56.scope. Feb 13 08:12:11.183061 sshd[13519]: pam_unix(sshd:session): session closed for user core Feb 13 08:12:11.184336 systemd[1]: sshd@60-145.40.90.207:22-139.178.68.195:53152.service: Deactivated successfully. Feb 13 08:12:11.184810 systemd[1]: session-56.scope: Deactivated successfully. Feb 13 08:12:11.185181 systemd-logind[1446]: Session 56 logged out. Waiting for processes to exit. Feb 13 08:12:11.185595 systemd-logind[1446]: Removed session 56. 
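The CNI delete failure above simply cycles: the calico plugin stats /var/lib/calico/nodename, a file maintained by the calico/node container, and with that container not running every StopPodSandbox attempt fails, so kubelet's pod workers keep retrying the same sandboxes for coredns-5d78c9869d-qrnjl, coredns-5d78c9869d-7xbl5, csi-node-driver-8djc9, and calico-kube-controllers-846b88998b-4vbpv. A hypothetical triage sketch (Python; journal.txt is an assumed local dump of a log like this one) counts the retries per sandbox ID:

# Count recurring "StopPodSandbox ... failed" errors per sandbox ID in a
# saved journal dump, to see which pods kubelet keeps retrying.
import re
from collections import Counter

FAILED = re.compile(r'StopPodSandbox for \\?"([0-9a-f]{64})\\?" failed')

counts = Counter()
with open("journal.txt", encoding="utf-8", errors="replace") as fh:  # assumed path
    for line in fh:
        counts.update(FAILED.findall(line))

for sandbox_id, retries in counts.most_common():
    print(f"{retries:4d}  {sandbox_id}")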
Feb 13 08:12:11.098000 audit[13519]: CRED_ACQ pid=13519 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:11.280522 kernel: audit: type=1101 audit(1707811931.097:1799): pid=13519 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:11.280581 kernel: audit: type=1103 audit(1707811931.098:1800): pid=13519 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:11.280603 kernel: audit: type=1006 audit(1707811931.098:1801): pid=13519 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=56 res=1 Feb 13 08:12:11.339155 kernel: audit: type=1300 audit(1707811931.098:1801): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff984bf760 a2=3 a3=0 items=0 ppid=1 pid=13519 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=56 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:12:11.098000 audit[13519]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff984bf760 a2=3 a3=0 items=0 ppid=1 pid=13519 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=56 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:12:11.431247 kernel: audit: type=1327 audit(1707811931.098:1801): proctitle=737368643A20636F7265205B707269765D Feb 13 08:12:11.098000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:12:11.461709 kernel: audit: type=1105 audit(1707811931.103:1802): pid=13519 uid=0 auid=500 ses=56 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:11.103000 audit[13519]: USER_START pid=13519 uid=0 auid=500 ses=56 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:11.556203 kernel: audit: type=1103 audit(1707811931.104:1803): pid=13521 uid=0 auid=500 ses=56 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:11.104000 audit[13521]: CRED_ACQ pid=13521 uid=0 auid=500 ses=56 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:11.645459 kernel: audit: type=1106 audit(1707811931.181:1804): pid=13519 uid=0 auid=500 ses=56 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close 
grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:11.181000 audit[13519]: USER_END pid=13519 uid=0 auid=500 ses=56 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:11.740997 kernel: audit: type=1104 audit(1707811931.182:1805): pid=13519 uid=0 auid=500 ses=56 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:11.182000 audit[13519]: CRED_DISP pid=13519 uid=0 auid=500 ses=56 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:11.182000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@60-145.40.90.207:22-139.178.68.195:53152 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:12:16.195674 systemd[1]: Started sshd@61-145.40.90.207:22-139.178.68.195:35924.service. Feb 13 08:12:16.194000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@61-145.40.90.207:22-139.178.68.195:35924 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:12:16.237682 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:12:16.237743 kernel: audit: type=1130 audit(1707811936.194:1807): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@61-145.40.90.207:22-139.178.68.195:35924 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:12:16.353004 sshd[13544]: Accepted publickey for core from 139.178.68.195 port 35924 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:12:16.351000 audit[13544]: USER_ACCT pid=13544 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:16.353937 sshd[13544]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:12:16.356687 systemd-logind[1446]: New session 57 of user core. Feb 13 08:12:16.357209 systemd[1]: Started session-57.scope. Feb 13 08:12:16.436468 sshd[13544]: pam_unix(sshd:session): session closed for user core Feb 13 08:12:16.437874 systemd[1]: sshd@61-145.40.90.207:22-139.178.68.195:35924.service: Deactivated successfully. Feb 13 08:12:16.438304 systemd[1]: session-57.scope: Deactivated successfully. Feb 13 08:12:16.438595 systemd-logind[1446]: Session 57 logged out. Waiting for processes to exit. Feb 13 08:12:16.439207 systemd-logind[1446]: Removed session 57. 
Feb 13 08:12:16.352000 audit[13544]: CRED_ACQ pid=13544 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:16.535047 kernel: audit: type=1101 audit(1707811936.351:1808): pid=13544 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:16.535097 kernel: audit: type=1103 audit(1707811936.352:1809): pid=13544 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:16.535117 kernel: audit: type=1006 audit(1707811936.352:1810): pid=13544 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=57 res=1 Feb 13 08:12:16.593709 kernel: audit: type=1300 audit(1707811936.352:1810): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffccf94e100 a2=3 a3=0 items=0 ppid=1 pid=13544 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=57 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:12:16.352000 audit[13544]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffccf94e100 a2=3 a3=0 items=0 ppid=1 pid=13544 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=57 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:12:16.352000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:12:16.705224 env[1458]: time="2024-02-13T08:12:16.705168608Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 08:12:16.716327 kernel: audit: type=1327 audit(1707811936.352:1810): proctitle=737368643A20636F7265205B707269765D Feb 13 08:12:16.716435 kernel: audit: type=1105 audit(1707811936.357:1811): pid=13544 uid=0 auid=500 ses=57 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:16.357000 audit[13544]: USER_START pid=13544 uid=0 auid=500 ses=57 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:16.718805 env[1458]: time="2024-02-13T08:12:16.718761231Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:12:16.718970 kubelet[2569]: E0213 08:12:16.718940 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to 
destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 08:12:16.718970 kubelet[2569]: E0213 08:12:16.718966 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 08:12:16.719173 kubelet[2569]: E0213 08:12:16.718988 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:12:16.719173 kubelet[2569]: E0213 08:12:16.719006 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 08:12:16.358000 audit[13546]: CRED_ACQ pid=13546 uid=0 auid=500 ses=57 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:16.900099 kernel: audit: type=1103 audit(1707811936.358:1812): pid=13546 uid=0 auid=500 ses=57 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:16.900154 kernel: audit: type=1106 audit(1707811936.435:1813): pid=13544 uid=0 auid=500 ses=57 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:16.435000 audit[13544]: USER_END pid=13544 uid=0 auid=500 ses=57 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:16.995706 kernel: audit: type=1104 audit(1707811936.435:1814): pid=13544 uid=0 auid=500 ses=57 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:16.435000 audit[13544]: CRED_DISP pid=13544 uid=0 auid=500 ses=57 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix 
acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:16.436000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@61-145.40.90.207:22-139.178.68.195:35924 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:12:17.706692 env[1458]: time="2024-02-13T08:12:17.706546591Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 08:12:17.756678 env[1458]: time="2024-02-13T08:12:17.756594706Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:12:17.756945 kubelet[2569]: E0213 08:12:17.756882 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 08:12:17.756945 kubelet[2569]: E0213 08:12:17.756928 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 08:12:17.757403 kubelet[2569]: E0213 08:12:17.756978 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:12:17.757403 kubelet[2569]: E0213 08:12:17.757019 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 08:12:19.707097 env[1458]: time="2024-02-13T08:12:19.706966772Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 08:12:19.707097 env[1458]: time="2024-02-13T08:12:19.706995902Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 08:12:19.755628 env[1458]: time="2024-02-13T08:12:19.755568639Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy 
network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:12:19.755870 kubelet[2569]: E0213 08:12:19.755842 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 08:12:19.756270 kubelet[2569]: E0213 08:12:19.755899 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 08:12:19.756270 kubelet[2569]: E0213 08:12:19.755964 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:12:19.756270 kubelet[2569]: E0213 08:12:19.756020 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 08:12:19.759054 env[1458]: time="2024-02-13T08:12:19.758982896Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:12:19.759188 kubelet[2569]: E0213 08:12:19.759168 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 08:12:19.759260 kubelet[2569]: E0213 08:12:19.759203 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 08:12:19.759260 kubelet[2569]: E0213 08:12:19.759250 2569 
kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:12:19.759449 kubelet[2569]: E0213 08:12:19.759284 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 08:12:21.446364 systemd[1]: Started sshd@62-145.40.90.207:22-139.178.68.195:35930.service. Feb 13 08:12:21.445000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@62-145.40.90.207:22-139.178.68.195:35930 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:12:21.473334 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:12:21.473406 kernel: audit: type=1130 audit(1707811941.445:1816): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@62-145.40.90.207:22-139.178.68.195:35930 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:12:21.589000 audit[13687]: USER_ACCT pid=13687 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:21.591329 sshd[13687]: Accepted publickey for core from 139.178.68.195 port 35930 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:12:21.594380 sshd[13687]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:12:21.598599 systemd-logind[1446]: New session 58 of user core. Feb 13 08:12:21.599080 systemd[1]: Started session-58.scope. Feb 13 08:12:21.678979 sshd[13687]: pam_unix(sshd:session): session closed for user core Feb 13 08:12:21.680341 systemd[1]: sshd@62-145.40.90.207:22-139.178.68.195:35930.service: Deactivated successfully. Feb 13 08:12:21.680775 systemd[1]: session-58.scope: Deactivated successfully. Feb 13 08:12:21.681170 systemd-logind[1446]: Session 58 logged out. Waiting for processes to exit. Feb 13 08:12:21.681575 systemd-logind[1446]: Removed session 58. 
Feb 13 08:12:21.592000 audit[13687]: CRED_ACQ pid=13687 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:21.772945 kernel: audit: type=1101 audit(1707811941.589:1817): pid=13687 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:21.772990 kernel: audit: type=1103 audit(1707811941.592:1818): pid=13687 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:21.773008 kernel: audit: type=1006 audit(1707811941.592:1819): pid=13687 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=58 res=1 Feb 13 08:12:21.831650 kernel: audit: type=1300 audit(1707811941.592:1819): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff930a2170 a2=3 a3=0 items=0 ppid=1 pid=13687 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=58 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:12:21.592000 audit[13687]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff930a2170 a2=3 a3=0 items=0 ppid=1 pid=13687 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=58 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:12:21.592000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:12:21.954061 kernel: audit: type=1327 audit(1707811941.592:1819): proctitle=737368643A20636F7265205B707269765D Feb 13 08:12:21.954092 kernel: audit: type=1105 audit(1707811941.599:1820): pid=13687 uid=0 auid=500 ses=58 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:21.599000 audit[13687]: USER_START pid=13687 uid=0 auid=500 ses=58 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:22.048440 kernel: audit: type=1103 audit(1707811941.599:1821): pid=13689 uid=0 auid=500 ses=58 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:21.599000 audit[13689]: CRED_ACQ pid=13689 uid=0 auid=500 ses=58 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:22.137548 kernel: audit: type=1106 audit(1707811941.677:1822): pid=13687 uid=0 auid=500 ses=58 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close 
grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:21.677000 audit[13687]: USER_END pid=13687 uid=0 auid=500 ses=58 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:22.232992 kernel: audit: type=1104 audit(1707811941.678:1823): pid=13687 uid=0 auid=500 ses=58 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:21.678000 audit[13687]: CRED_DISP pid=13687 uid=0 auid=500 ses=58 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:21.678000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@62-145.40.90.207:22-139.178.68.195:35930 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:12:25.788690 sshd[11993]: Timeout before authentication for 101.43.185.249 port 50204 Feb 13 08:12:25.790166 systemd[1]: sshd@35-145.40.90.207:22-101.43.185.249:50204.service: Deactivated successfully. Feb 13 08:12:25.789000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@35-145.40.90.207:22-101.43.185.249:50204 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:12:26.688367 systemd[1]: Started sshd@63-145.40.90.207:22-139.178.68.195:36630.service. Feb 13 08:12:26.686000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@63-145.40.90.207:22-139.178.68.195:36630 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:12:26.714880 kernel: kauditd_printk_skb: 2 callbacks suppressed Feb 13 08:12:26.714924 kernel: audit: type=1130 audit(1707811946.686:1826): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@63-145.40.90.207:22-139.178.68.195:36630 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:12:26.830000 audit[13712]: USER_ACCT pid=13712 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:26.831920 sshd[13712]: Accepted publickey for core from 139.178.68.195 port 36630 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:12:26.834065 sshd[13712]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:12:26.838574 systemd-logind[1446]: New session 59 of user core. Feb 13 08:12:26.839055 systemd[1]: Started session-59.scope. Feb 13 08:12:26.918696 sshd[13712]: pam_unix(sshd:session): session closed for user core Feb 13 08:12:26.920236 systemd[1]: sshd@63-145.40.90.207:22-139.178.68.195:36630.service: Deactivated successfully. 
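
The kernel "audit: type=…" lines here arrive through kauditd's printk path (note the "kauditd_printk_skb: N callbacks suppressed" markers), so they reach the journal seconds after the userspace events they describe; the authoritative event time is the epoch stamp inside audit(1707811941.592:1819), not the journal timestamp. A small sketch, assuming the audit(EPOCH.FRAC:SERIAL) format seen above, that recovers the real ordering:

# Pull the event time and serial number out of a kernel audit line so the
# out-of-order printk output can be re-sorted by when things actually happened.
import re
from datetime import datetime, timezone

AUDIT_STAMP = re.compile(r"audit\((\d+)\.(\d+):(\d+)\)")

def audit_time(line):
    m = AUDIT_STAMP.search(line)
    if m is None:
        return None
    event = datetime.fromtimestamp(float(f"{m.group(1)}.{m.group(2)}"),
                                   tz=timezone.utc)
    return event, int(m.group(3))

print(audit_time("kernel: audit: type=1006 audit(1707811946.832:1829): ..."))
# -> event time 2024-02-13 08:12:26.832 UTC, serial 1829, matching the
#    sshd session-59 login that the journal reports at 08:12:26.832000
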
Feb 13 08:12:26.920715 systemd[1]: session-59.scope: Deactivated successfully. Feb 13 08:12:26.921068 systemd-logind[1446]: Session 59 logged out. Waiting for processes to exit. Feb 13 08:12:26.921502 systemd-logind[1446]: Removed session 59. Feb 13 08:12:26.831000 audit[13712]: CRED_ACQ pid=13712 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:27.015408 kernel: audit: type=1101 audit(1707811946.830:1827): pid=13712 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:27.015447 kernel: audit: type=1103 audit(1707811946.831:1828): pid=13712 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:27.015468 kernel: audit: type=1006 audit(1707811946.832:1829): pid=13712 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=59 res=1 Feb 13 08:12:27.074002 kernel: audit: type=1300 audit(1707811946.832:1829): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffc43097f0 a2=3 a3=0 items=0 ppid=1 pid=13712 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=59 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:12:26.832000 audit[13712]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffc43097f0 a2=3 a3=0 items=0 ppid=1 pid=13712 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=59 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:12:27.166067 kernel: audit: type=1327 audit(1707811946.832:1829): proctitle=737368643A20636F7265205B707269765D Feb 13 08:12:26.832000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:12:27.196541 kernel: audit: type=1105 audit(1707811946.839:1830): pid=13712 uid=0 auid=500 ses=59 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:26.839000 audit[13712]: USER_START pid=13712 uid=0 auid=500 ses=59 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:27.290986 kernel: audit: type=1103 audit(1707811946.840:1831): pid=13714 uid=0 auid=500 ses=59 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:26.840000 audit[13714]: CRED_ACQ pid=13714 uid=0 auid=500 ses=59 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 
08:12:27.380168 kernel: audit: type=1106 audit(1707811946.917:1832): pid=13712 uid=0 auid=500 ses=59 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:26.917000 audit[13712]: USER_END pid=13712 uid=0 auid=500 ses=59 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:27.475628 kernel: audit: type=1104 audit(1707811946.917:1833): pid=13712 uid=0 auid=500 ses=59 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:26.917000 audit[13712]: CRED_DISP pid=13712 uid=0 auid=500 ses=59 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:26.918000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@63-145.40.90.207:22-139.178.68.195:36630 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:12:28.706917 env[1458]: time="2024-02-13T08:12:28.706787185Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 08:12:28.733677 env[1458]: time="2024-02-13T08:12:28.733606286Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:12:28.733851 kubelet[2569]: E0213 08:12:28.733810 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 08:12:28.733851 kubelet[2569]: E0213 08:12:28.733834 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 08:12:28.734049 kubelet[2569]: E0213 08:12:28.733856 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 
08:12:28.734049 kubelet[2569]: E0213 08:12:28.733874 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 08:12:29.705858 env[1458]: time="2024-02-13T08:12:29.705833902Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 08:12:29.718579 env[1458]: time="2024-02-13T08:12:29.718541477Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:12:29.718837 kubelet[2569]: E0213 08:12:29.718698 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 08:12:29.718837 kubelet[2569]: E0213 08:12:29.718725 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 08:12:29.718837 kubelet[2569]: E0213 08:12:29.718751 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:12:29.718837 kubelet[2569]: E0213 08:12:29.718770 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 08:12:30.707105 env[1458]: time="2024-02-13T08:12:30.706975040Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 08:12:30.733320 env[1458]: time="2024-02-13T08:12:30.733286301Z" level=error msg="StopPodSandbox for 
\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:12:30.733628 kubelet[2569]: E0213 08:12:30.733517 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 08:12:30.733628 kubelet[2569]: E0213 08:12:30.733543 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 08:12:30.733628 kubelet[2569]: E0213 08:12:30.733565 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:12:30.733628 kubelet[2569]: E0213 08:12:30.733583 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 08:12:31.927950 systemd[1]: Started sshd@64-145.40.90.207:22-139.178.68.195:36634.service. Feb 13 08:12:31.926000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@64-145.40.90.207:22-139.178.68.195:36634 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:12:31.954723 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:12:31.954786 kernel: audit: type=1130 audit(1707811951.926:1835): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@64-145.40.90.207:22-139.178.68.195:36634 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:12:32.071000 audit[13822]: USER_ACCT pid=13822 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:32.072741 sshd[13822]: Accepted publickey for core from 139.178.68.195 port 36634 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:12:32.073934 sshd[13822]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:12:32.076269 systemd-logind[1446]: New session 60 of user core. Feb 13 08:12:32.076762 systemd[1]: Started session-60.scope. Feb 13 08:12:32.156737 sshd[13822]: pam_unix(sshd:session): session closed for user core Feb 13 08:12:32.158251 systemd[1]: sshd@64-145.40.90.207:22-139.178.68.195:36634.service: Deactivated successfully. Feb 13 08:12:32.158688 systemd[1]: session-60.scope: Deactivated successfully. Feb 13 08:12:32.159050 systemd-logind[1446]: Session 60 logged out. Waiting for processes to exit. Feb 13 08:12:32.159521 systemd-logind[1446]: Removed session 60. Feb 13 08:12:32.072000 audit[13822]: CRED_ACQ pid=13822 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:32.256097 kernel: audit: type=1101 audit(1707811952.071:1836): pid=13822 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:32.256139 kernel: audit: type=1103 audit(1707811952.072:1837): pid=13822 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:32.256161 kernel: audit: type=1006 audit(1707811952.072:1838): pid=13822 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=60 res=1 Feb 13 08:12:32.314712 kernel: audit: type=1300 audit(1707811952.072:1838): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd45a11c70 a2=3 a3=0 items=0 ppid=1 pid=13822 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=60 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:12:32.072000 audit[13822]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd45a11c70 a2=3 a3=0 items=0 ppid=1 pid=13822 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=60 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:12:32.406709 kernel: audit: type=1327 audit(1707811952.072:1838): proctitle=737368643A20636F7265205B707269765D Feb 13 08:12:32.072000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:12:32.437183 kernel: audit: type=1105 audit(1707811952.077:1839): pid=13822 uid=0 auid=500 ses=60 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 
08:12:32.077000 audit[13822]: USER_START pid=13822 uid=0 auid=500 ses=60 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:32.531708 kernel: audit: type=1103 audit(1707811952.078:1840): pid=13824 uid=0 auid=500 ses=60 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:32.078000 audit[13824]: CRED_ACQ pid=13824 uid=0 auid=500 ses=60 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:32.620906 kernel: audit: type=1106 audit(1707811952.156:1841): pid=13822 uid=0 auid=500 ses=60 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:32.156000 audit[13822]: USER_END pid=13822 uid=0 auid=500 ses=60 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:32.705909 env[1458]: time="2024-02-13T08:12:32.705888406Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 08:12:32.156000 audit[13822]: CRED_DISP pid=13822 uid=0 auid=500 ses=60 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:32.717593 env[1458]: time="2024-02-13T08:12:32.717562222Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:12:32.717756 kubelet[2569]: E0213 08:12:32.717717 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 08:12:32.717756 kubelet[2569]: E0213 08:12:32.717742 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 08:12:32.717951 kubelet[2569]: E0213 08:12:32.717764 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with 
KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:12:32.717951 kubelet[2569]: E0213 08:12:32.717782 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 08:12:32.805779 kernel: audit: type=1104 audit(1707811952.156:1842): pid=13822 uid=0 auid=500 ses=60 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:32.157000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@64-145.40.90.207:22-139.178.68.195:36634 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:12:37.166959 systemd[1]: Started sshd@65-145.40.90.207:22-139.178.68.195:48460.service. Feb 13 08:12:37.165000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@65-145.40.90.207:22-139.178.68.195:48460 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:12:37.193871 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:12:37.193920 kernel: audit: type=1130 audit(1707811957.165:1844): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@65-145.40.90.207:22-139.178.68.195:48460 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:12:37.310000 audit[13875]: USER_ACCT pid=13875 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:37.311022 sshd[13875]: Accepted publickey for core from 139.178.68.195 port 48460 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:12:37.314235 sshd[13875]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:12:37.318716 systemd-logind[1446]: New session 61 of user core. Feb 13 08:12:37.319814 systemd[1]: Started session-61.scope. Feb 13 08:12:37.312000 audit[13875]: CRED_ACQ pid=13875 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:37.403273 sshd[13875]: pam_unix(sshd:session): session closed for user core Feb 13 08:12:37.404686 systemd[1]: sshd@65-145.40.90.207:22-139.178.68.195:48460.service: Deactivated successfully. 
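
Each of the short SSH sessions in this stretch (58 through 63) emits the same PAM record sequence: USER_ACCT and CRED_ACQ on accept, USER_START when the session scope opens, then USER_END and CRED_DISP on logout, bracketed by SERVICE_START/SERVICE_STOP for the per-connection sshd@… unit. A triage sketch, assuming one record per line as journalctl emits them, that flags sessions missing any stage:

# Group sshd PAM audit records by pid and report which lifecycle stages are
# missing; a session that opens but never logs USER_END/CRED_DISP is still
# live (or hung).
import re
from collections import defaultdict

PAM = re.compile(r"audit\[(\d+)\]: "
                 r"(USER_ACCT|CRED_ACQ|USER_START|USER_END|CRED_DISP)")
LIFECYCLE = {"USER_ACCT", "CRED_ACQ", "USER_START", "USER_END", "CRED_DISP"}

def missing_stages(lines):
    seen = defaultdict(set)
    for line in lines:
        if 'exe="/usr/sbin/sshd"' not in line:
            continue
        m = PAM.search(line)
        if m:
            seen[m.group(1)].add(m.group(2))
    return {pid: sorted(LIFECYCLE - stages)
            for pid, stages in seen.items() if stages != LIFECYCLE}

On this journal the privileged sshd pids (13687, 13712, 13822, 13875, 13989) come back complete, while each session's unprivileged child (13689, 13714, …) only ever logs CRED_ACQ, so a more robust grouping would key on the ses= field rather than the pid.
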
Feb 13 08:12:37.405133 systemd[1]: session-61.scope: Deactivated successfully. Feb 13 08:12:37.405429 systemd-logind[1446]: Session 61 logged out. Waiting for processes to exit. Feb 13 08:12:37.405793 systemd-logind[1446]: Removed session 61. Feb 13 08:12:37.492791 kernel: audit: type=1101 audit(1707811957.310:1845): pid=13875 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:37.492835 kernel: audit: type=1103 audit(1707811957.312:1846): pid=13875 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:37.492852 kernel: audit: type=1006 audit(1707811957.312:1847): pid=13875 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=61 res=1 Feb 13 08:12:37.551434 kernel: audit: type=1300 audit(1707811957.312:1847): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffbefc3b90 a2=3 a3=0 items=0 ppid=1 pid=13875 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=61 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:12:37.312000 audit[13875]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffbefc3b90 a2=3 a3=0 items=0 ppid=1 pid=13875 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=61 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:12:37.643441 kernel: audit: type=1327 audit(1707811957.312:1847): proctitle=737368643A20636F7265205B707269765D Feb 13 08:12:37.312000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:12:37.673928 kernel: audit: type=1105 audit(1707811957.323:1848): pid=13875 uid=0 auid=500 ses=61 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:37.323000 audit[13875]: USER_START pid=13875 uid=0 auid=500 ses=61 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:37.768420 kernel: audit: type=1103 audit(1707811957.325:1849): pid=13877 uid=0 auid=500 ses=61 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:37.325000 audit[13877]: CRED_ACQ pid=13877 uid=0 auid=500 ses=61 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:37.403000 audit[13875]: USER_END pid=13875 uid=0 auid=500 ses=61 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 
addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:37.403000 audit[13875]: CRED_DISP pid=13875 uid=0 auid=500 ses=61 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:38.042439 kernel: audit: type=1106 audit(1707811957.403:1850): pid=13875 uid=0 auid=500 ses=61 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:38.042478 kernel: audit: type=1104 audit(1707811957.403:1851): pid=13875 uid=0 auid=500 ses=61 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:37.404000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@65-145.40.90.207:22-139.178.68.195:48460 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:12:39.707524 env[1458]: time="2024-02-13T08:12:39.707376483Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 08:12:39.757924 env[1458]: time="2024-02-13T08:12:39.757832454Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:12:39.758117 kubelet[2569]: E0213 08:12:39.758088 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 08:12:39.758456 kubelet[2569]: E0213 08:12:39.758132 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 08:12:39.758456 kubelet[2569]: E0213 08:12:39.758177 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:12:39.758456 kubelet[2569]: E0213 08:12:39.758211 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox 
\\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 08:12:40.705055 env[1458]: time="2024-02-13T08:12:40.705028442Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 08:12:40.719373 env[1458]: time="2024-02-13T08:12:40.719331002Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:12:40.719640 kubelet[2569]: E0213 08:12:40.719524 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 08:12:40.719640 kubelet[2569]: E0213 08:12:40.719559 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 08:12:40.719640 kubelet[2569]: E0213 08:12:40.719583 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:12:40.719640 kubelet[2569]: E0213 08:12:40.719603 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 08:12:41.706190 env[1458]: time="2024-02-13T08:12:41.706056614Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 08:12:41.758438 env[1458]: time="2024-02-13T08:12:41.758336930Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:12:41.758877 kubelet[2569]: E0213 08:12:41.758616 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 08:12:41.758877 kubelet[2569]: E0213 08:12:41.758691 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 08:12:41.758877 kubelet[2569]: E0213 08:12:41.758737 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:12:41.758877 kubelet[2569]: E0213 08:12:41.758771 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 08:12:42.413280 systemd[1]: Started sshd@66-145.40.90.207:22-139.178.68.195:48470.service. Feb 13 08:12:42.411000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@66-145.40.90.207:22-139.178.68.195:48470 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:12:42.440150 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:12:42.440243 kernel: audit: type=1130 audit(1707811962.411:1853): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@66-145.40.90.207:22-139.178.68.195:48470 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:12:42.557000 audit[13989]: USER_ACCT pid=13989 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:42.558743 sshd[13989]: Accepted publickey for core from 139.178.68.195 port 48470 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:12:42.562928 sshd[13989]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:12:42.565274 systemd-logind[1446]: New session 62 of user core. Feb 13 08:12:42.565904 systemd[1]: Started session-62.scope. 
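
By this point the journal shows the teardown loop on a steady cadence: the kubelet keeps re-issuing StopPodSandbox for the same sandboxes and gets the identical nodename error back, so nothing is making progress. A quick tally over a journalctl dump, assuming the escaped-quote formatting seen above, makes the stuck set obvious:

# Count "failed to destroy network for sandbox" errors per sandbox ID to see
# which pods are wedged in the CNI delete loop.
import re
from collections import Counter

SANDBOX = re.compile(r'failed to destroy network for sandbox \W*([0-9a-f]{64})')

def stuck_sandboxes(lines):
    return Counter(m.group(1) for line in lines
                   if (m := SANDBOX.search(line)))

Over this section it reports exactly four IDs, one per affected pod: 8d49648… (coredns-5d78c9869d-7xbl5), ce02c9de… (calico-kube-controllers-846b88998b-4vbpv), 1a7bd63d… (csi-node-driver-8djc9) and 2eb78bff… (coredns-5d78c9869d-qrnjl).
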
Feb 13 08:12:42.647179 sshd[13989]: pam_unix(sshd:session): session closed for user core Feb 13 08:12:42.648767 systemd[1]: sshd@66-145.40.90.207:22-139.178.68.195:48470.service: Deactivated successfully. Feb 13 08:12:42.649177 systemd[1]: session-62.scope: Deactivated successfully. Feb 13 08:12:42.649512 systemd-logind[1446]: Session 62 logged out. Waiting for processes to exit. Feb 13 08:12:42.650113 systemd-logind[1446]: Removed session 62. Feb 13 08:12:42.561000 audit[13989]: CRED_ACQ pid=13989 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:42.742669 kernel: audit: type=1101 audit(1707811962.557:1854): pid=13989 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:42.742728 kernel: audit: type=1103 audit(1707811962.561:1855): pid=13989 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:42.742745 kernel: audit: type=1006 audit(1707811962.561:1856): pid=13989 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=62 res=1 Feb 13 08:12:42.561000 audit[13989]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd79047f00 a2=3 a3=0 items=0 ppid=1 pid=13989 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=62 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:12:42.893268 kernel: audit: type=1300 audit(1707811962.561:1856): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd79047f00 a2=3 a3=0 items=0 ppid=1 pid=13989 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=62 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:12:42.893346 kernel: audit: type=1327 audit(1707811962.561:1856): proctitle=737368643A20636F7265205B707269765D Feb 13 08:12:42.561000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:12:42.566000 audit[13989]: USER_START pid=13989 uid=0 auid=500 ses=62 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:43.018293 kernel: audit: type=1105 audit(1707811962.566:1857): pid=13989 uid=0 auid=500 ses=62 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:43.018370 kernel: audit: type=1103 audit(1707811962.566:1858): pid=13991 uid=0 auid=500 ses=62 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:42.566000 audit[13991]: CRED_ACQ pid=13991 uid=0 auid=500 ses=62 
subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:43.107499 kernel: audit: type=1106 audit(1707811962.646:1859): pid=13989 uid=0 auid=500 ses=62 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:42.646000 audit[13989]: USER_END pid=13989 uid=0 auid=500 ses=62 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:43.203005 kernel: audit: type=1104 audit(1707811962.646:1860): pid=13989 uid=0 auid=500 ses=62 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:42.646000 audit[13989]: CRED_DISP pid=13989 uid=0 auid=500 ses=62 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:12:42.647000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@66-145.40.90.207:22-139.178.68.195:48470 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:12:44.706845 env[1458]: time="2024-02-13T08:12:44.706708112Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 08:12:44.753053 env[1458]: time="2024-02-13T08:12:44.752971657Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:12:44.753257 kubelet[2569]: E0213 08:12:44.753241 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 08:12:44.753434 kubelet[2569]: E0213 08:12:44.753272 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 08:12:44.753434 kubelet[2569]: E0213 08:12:44.753296 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": 
plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 13 08:12:44.753434 kubelet[2569]: E0213 08:12:44.753316 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23
Feb 13 08:12:46.122000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 08:12:46.122000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c000b238e0 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null)
Feb 13 08:12:46.122000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269
Feb 13 08:12:46.122000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 08:12:46.122000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c001b15170 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null)
Feb 13 08:12:46.122000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269
Feb 13 08:12:46.190000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/apiserver.crt" dev="sdb9" ino=524817 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 08:12:46.190000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5c a1=c00a374c60 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null)
Feb 13 08:12:46.190000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75
Feb 13 08:12:46.190000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 08:12:46.190000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=60 a1=c00a3b1440 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null)
Feb 13 08:12:46.190000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75
Feb 13 08:12:46.190000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 08:12:46.190000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5c a1=c00e7416e0 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null)
Feb 13 08:12:46.190000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75
Feb 13 08:12:46.943000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 08:12:46.943000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-client.crt" dev="sdb9" ino=524823 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 08:12:46.943000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 08:12:46.943000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=60 a1=c00ae94bd0 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null)
Feb 13 08:12:46.943000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5c a1=c00e69f860 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null)
Feb 13 08:12:46.943000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5d a1=c00a3b1560 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null)
Feb 13 08:12:46.943000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75
Feb 13 08:12:46.943000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75
Feb 13 08:12:46.943000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75
Feb 13 08:12:47.658291 systemd[1]: Started sshd@67-145.40.90.207:22-139.178.68.195:60674.service.
Feb 13 08:12:47.657000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@67-145.40.90.207:22-139.178.68.195:60674 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:12:47.698668 kernel: kauditd_printk_skb: 25 callbacks suppressed
Feb 13 08:12:47.698788 kernel: audit: type=1130 audit(1707811967.657:1870): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@67-145.40.90.207:22-139.178.68.195:60674 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:12:47.814917 sshd[14041]: Accepted publickey for core from 139.178.68.195 port 60674 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg
Feb 13 08:12:47.813000 audit[14041]: USER_ACCT pid=14041 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:12:47.817185 sshd[14041]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Feb 13 08:12:47.819541 systemd-logind[1446]: New session 63 of user core.
Feb 13 08:12:47.820078 systemd[1]: Started session-63.scope.
Feb 13 08:12:47.902795 sshd[14041]: pam_unix(sshd:session): session closed for user core
Feb 13 08:12:47.904277 systemd[1]: sshd@67-145.40.90.207:22-139.178.68.195:60674.service: Deactivated successfully.
Feb 13 08:12:47.904703 systemd[1]: session-63.scope: Deactivated successfully.
Feb 13 08:12:47.905052 systemd-logind[1446]: Session 63 logged out. Waiting for processes to exit.
Feb 13 08:12:47.905506 systemd-logind[1446]: Removed session 63.
Feb 13 08:12:47.816000 audit[14041]: CRED_ACQ pid=14041 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:12:47.996914 kernel: audit: type=1101 audit(1707811967.813:1871): pid=14041 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:12:47.996961 kernel: audit: type=1103 audit(1707811967.816:1872): pid=14041 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:12:47.996983 kernel: audit: type=1006 audit(1707811967.816:1873): pid=14041 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=63 res=1
Feb 13 08:12:48.055541 kernel: audit: type=1300 audit(1707811967.816:1873): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffce05ecb0 a2=3 a3=0 items=0 ppid=1 pid=14041 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=63 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:12:47.816000 audit[14041]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffce05ecb0 a2=3 a3=0 items=0 ppid=1 pid=14041 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=63 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:12:48.147518 kernel: audit: type=1327 audit(1707811967.816:1873): proctitle=737368643A20636F7265205B707269765D
Feb 13 08:12:47.816000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D
Feb 13 08:12:48.177977 kernel: audit: type=1105 audit(1707811967.820:1874): pid=14041 uid=0 auid=500 ses=63 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:12:47.820000 audit[14041]: USER_START pid=14041 uid=0 auid=500 ses=63 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:12:47.820000 audit[14043]: CRED_ACQ pid=14043 uid=0 auid=500 ses=63 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:12:48.361707 kernel: audit: type=1103 audit(1707811967.820:1875): pid=14043 uid=0 auid=500 ses=63 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:12:48.361754 kernel: audit: type=1106 audit(1707811967.902:1876): pid=14041 uid=0 auid=500 ses=63 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:12:47.902000 audit[14041]: USER_END pid=14041 uid=0 auid=500 ses=63 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:12:48.457246 kernel: audit: type=1104 audit(1707811967.902:1877): pid=14041 uid=0 auid=500 ses=63 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:12:47.902000 audit[14041]: CRED_DISP pid=14041 uid=0 auid=500 ses=63 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:12:47.903000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@67-145.40.90.207:22-139.178.68.195:60674 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:12:50.706077 env[1458]: time="2024-02-13T08:12:50.705984083Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\""
Feb 13 08:12:50.735806 env[1458]: time="2024-02-13T08:12:50.735773393Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 08:12:50.735966 kubelet[2569]: E0213 08:12:50.735942 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e"
Feb 13 08:12:50.736132 kubelet[2569]: E0213 08:12:50.735981 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e}
Feb 13 08:12:50.736132 kubelet[2569]: E0213 08:12:50.736003 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 13 08:12:50.736132 kubelet[2569]: E0213 08:12:50.736019 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b
Feb 13 08:12:50.875000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 08:12:50.875000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c000b23ae0 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null)
Feb 13 08:12:50.875000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269
Feb 13 08:12:50.877000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 08:12:50.877000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c00015f4e0 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null)
Feb 13 08:12:50.877000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269
Feb 13 08:12:50.877000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 08:12:50.877000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c002c3ea00 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null)
Feb 13 08:12:50.877000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269
Feb 13 08:12:50.882000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 08:12:50.882000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c000b23c80 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null)
Feb 13 08:12:50.882000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269
Feb 13 08:12:52.706713 env[1458]: time="2024-02-13T08:12:52.706584578Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\""
Feb 13 08:12:52.733367 env[1458]: time="2024-02-13T08:12:52.733330734Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 08:12:52.733537 kubelet[2569]: E0213 08:12:52.733527 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100"
Feb 13 08:12:52.733736 kubelet[2569]: E0213 08:12:52.733553 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100}
Feb 13 08:12:52.733736 kubelet[2569]: E0213 08:12:52.733575 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 13 08:12:52.733736 kubelet[2569]: E0213 08:12:52.733601 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755
Feb 13 08:12:52.911852 systemd[1]: Started sshd@68-145.40.90.207:22-139.178.68.195:60686.service.
Feb 13 08:12:52.910000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@68-145.40.90.207:22-139.178.68.195:60686 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:12:52.938807 kernel: kauditd_printk_skb: 13 callbacks suppressed
Feb 13 08:12:52.938890 kernel: audit: type=1130 audit(1707811972.910:1883): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@68-145.40.90.207:22-139.178.68.195:60686 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:12:53.056000 audit[14129]: USER_ACCT pid=14129 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:12:53.057006 sshd[14129]: Accepted publickey for core from 139.178.68.195 port 60686 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg
Feb 13 08:12:53.059930 sshd[14129]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Feb 13 08:12:53.062315 systemd-logind[1446]: New session 64 of user core.
Feb 13 08:12:53.062773 systemd[1]: Started session-64.scope.
Feb 13 08:12:53.144960 sshd[14129]: pam_unix(sshd:session): session closed for user core
Feb 13 08:12:53.146310 systemd[1]: sshd@68-145.40.90.207:22-139.178.68.195:60686.service: Deactivated successfully.
Feb 13 08:12:53.146738 systemd[1]: session-64.scope: Deactivated successfully.
Feb 13 08:12:53.147083 systemd-logind[1446]: Session 64 logged out. Waiting for processes to exit.
Feb 13 08:12:53.147511 systemd-logind[1446]: Removed session 64.
Feb 13 08:12:53.058000 audit[14129]: CRED_ACQ pid=14129 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:12:53.238906 kernel: audit: type=1101 audit(1707811973.056:1884): pid=14129 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:12:53.238955 kernel: audit: type=1103 audit(1707811973.058:1885): pid=14129 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:12:53.238977 kernel: audit: type=1006 audit(1707811973.058:1886): pid=14129 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=64 res=1
Feb 13 08:12:53.297570 kernel: audit: type=1300 audit(1707811973.058:1886): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffdf3c60850 a2=3 a3=0 items=0 ppid=1 pid=14129 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=64 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:12:53.058000 audit[14129]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffdf3c60850 a2=3 a3=0 items=0 ppid=1 pid=14129 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=64 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:12:53.389715 kernel: audit: type=1327 audit(1707811973.058:1886): proctitle=737368643A20636F7265205B707269765D
Feb 13 08:12:53.058000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D
Feb 13 08:12:53.420163 kernel: audit: type=1105 audit(1707811973.063:1887): pid=14129 uid=0 auid=500 ses=64 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:12:53.063000 audit[14129]: USER_START pid=14129 uid=0 auid=500 ses=64 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:12:53.514635 kernel: audit: type=1103 audit(1707811973.063:1888): pid=14131 uid=0 auid=500 ses=64 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:12:53.063000 audit[14131]: CRED_ACQ pid=14131 uid=0 auid=500 ses=64 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:12:53.603840 kernel: audit: type=1106 audit(1707811973.144:1889): pid=14129 uid=0 auid=500 ses=64 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:12:53.144000 audit[14129]: USER_END pid=14129 uid=0 auid=500 ses=64 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:12:53.699345 kernel: audit: type=1104 audit(1707811973.144:1890): pid=14129 uid=0 auid=500 ses=64 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:12:53.144000 audit[14129]: CRED_DISP pid=14129 uid=0 auid=500 ses=64 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:12:53.705049 env[1458]: time="2024-02-13T08:12:53.704997434Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\""
Feb 13 08:12:53.716748 env[1458]: time="2024-02-13T08:12:53.716682025Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 08:12:53.716971 kubelet[2569]: E0213 08:12:53.716870 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b"
Feb 13 08:12:53.716971 kubelet[2569]: E0213 08:12:53.716898 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b}
Feb 13 08:12:53.716971 kubelet[2569]: E0213 08:12:53.716927 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 13 08:12:53.716971 kubelet[2569]: E0213 08:12:53.716947 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06
Feb 13 08:12:53.144000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@68-145.40.90.207:22-139.178.68.195:60686 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:12:57.706821 env[1458]: time="2024-02-13T08:12:57.706688693Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\""
Feb 13 08:12:57.733302 env[1458]: time="2024-02-13T08:12:57.733228490Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 08:12:57.733476 kubelet[2569]: E0213 08:12:57.733428 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988"
Feb 13 08:12:57.733476 kubelet[2569]: E0213 08:12:57.733467 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988}
Feb 13 08:12:57.733666 kubelet[2569]: E0213 08:12:57.733498 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 13 08:12:57.733666 kubelet[2569]: E0213 08:12:57.733525 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23
Feb 13 08:12:58.154953 systemd[1]: Started sshd@69-145.40.90.207:22-139.178.68.195:46022.service.
Feb 13 08:12:58.154000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@69-145.40.90.207:22-139.178.68.195:46022 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:12:58.181796 kernel: kauditd_printk_skb: 1 callbacks suppressed
Feb 13 08:12:58.181855 kernel: audit: type=1130 audit(1707811978.154:1892): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@69-145.40.90.207:22-139.178.68.195:46022 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:12:58.299000 audit[14210]: USER_ACCT pid=14210 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:12:58.299927 sshd[14210]: Accepted publickey for core from 139.178.68.195 port 46022 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg
Feb 13 08:12:58.301165 sshd[14210]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Feb 13 08:12:58.303593 systemd-logind[1446]: New session 65 of user core.
Feb 13 08:12:58.304183 systemd[1]: Started session-65.scope.
Feb 13 08:12:58.381863 sshd[14210]: pam_unix(sshd:session): session closed for user core
Feb 13 08:12:58.383275 systemd[1]: sshd@69-145.40.90.207:22-139.178.68.195:46022.service: Deactivated successfully.
Feb 13 08:12:58.383710 systemd[1]: session-65.scope: Deactivated successfully.
Feb 13 08:12:58.384084 systemd-logind[1446]: Session 65 logged out. Waiting for processes to exit.
Feb 13 08:12:58.384515 systemd-logind[1446]: Removed session 65.
Feb 13 08:12:58.300000 audit[14210]: CRED_ACQ pid=14210 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:12:58.392697 kernel: audit: type=1101 audit(1707811978.299:1893): pid=14210 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:12:58.392730 kernel: audit: type=1103 audit(1707811978.300:1894): pid=14210 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:12:58.541062 kernel: audit: type=1006 audit(1707811978.300:1895): pid=14210 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=65 res=1
Feb 13 08:12:58.541097 kernel: audit: type=1300 audit(1707811978.300:1895): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffcb6c533f0 a2=3 a3=0 items=0 ppid=1 pid=14210 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=65 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:12:58.300000 audit[14210]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffcb6c533f0 a2=3 a3=0 items=0 ppid=1 pid=14210 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=65 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:12:58.633024 kernel: audit: type=1327 audit(1707811978.300:1895): proctitle=737368643A20636F7265205B707269765D
Feb 13 08:12:58.300000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D
Feb 13 08:12:58.663456 kernel: audit: type=1105 audit(1707811978.305:1896): pid=14210 uid=0 auid=500 ses=65 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:12:58.305000 audit[14210]: USER_START pid=14210 uid=0 auid=500 ses=65 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:12:58.306000 audit[14212]: CRED_ACQ pid=14212 uid=0 auid=500 ses=65 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:12:58.847005 kernel: audit: type=1103 audit(1707811978.306:1897): pid=14212 uid=0 auid=500 ses=65 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:12:58.847038 kernel: audit: type=1106 audit(1707811978.381:1898): pid=14210 uid=0 auid=500 ses=65 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:12:58.381000 audit[14210]: USER_END pid=14210 uid=0 auid=500 ses=65 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:12:58.942536 kernel: audit: type=1104 audit(1707811978.381:1899): pid=14210 uid=0 auid=500 ses=65 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:12:58.381000 audit[14210]: CRED_DISP pid=14210 uid=0 auid=500 ses=65 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:12:58.382000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@69-145.40.90.207:22-139.178.68.195:46022 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:13:03.392071 systemd[1]: Started sshd@70-145.40.90.207:22-139.178.68.195:46036.service.
Feb 13 08:13:03.391000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@70-145.40.90.207:22-139.178.68.195:46036 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:13:03.418880 kernel: kauditd_printk_skb: 1 callbacks suppressed
Feb 13 08:13:03.418966 kernel: audit: type=1130 audit(1707811983.391:1901): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@70-145.40.90.207:22-139.178.68.195:46036 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:13:03.536000 audit[14234]: USER_ACCT pid=14234 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:03.536878 sshd[14234]: Accepted publickey for core from 139.178.68.195 port 46036 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg
Feb 13 08:13:03.537908 sshd[14234]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Feb 13 08:13:03.540291 systemd-logind[1446]: New session 66 of user core.
Feb 13 08:13:03.540799 systemd[1]: Started session-66.scope.
Feb 13 08:13:03.537000 audit[14234]: CRED_ACQ pid=14234 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:03.632805 sshd[14234]: pam_unix(sshd:session): session closed for user core
Feb 13 08:13:03.634296 systemd[1]: sshd@70-145.40.90.207:22-139.178.68.195:46036.service: Deactivated successfully.
Feb 13 08:13:03.634772 systemd[1]: session-66.scope: Deactivated successfully.
Feb 13 08:13:03.635149 systemd-logind[1446]: Session 66 logged out. Waiting for processes to exit.
Feb 13 08:13:03.635836 systemd-logind[1446]: Removed session 66.
Feb 13 08:13:03.720759 kernel: audit: type=1101 audit(1707811983.536:1902): pid=14234 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:03.720804 kernel: audit: type=1103 audit(1707811983.537:1903): pid=14234 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:03.720821 kernel: audit: type=1006 audit(1707811983.537:1904): pid=14234 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=66 res=1
Feb 13 08:13:03.779354 kernel: audit: type=1300 audit(1707811983.537:1904): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe64a66230 a2=3 a3=0 items=0 ppid=1 pid=14234 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=66 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:13:03.537000 audit[14234]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe64a66230 a2=3 a3=0 items=0 ppid=1 pid=14234 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=66 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:13:03.871264 kernel: audit: type=1327 audit(1707811983.537:1904): proctitle=737368643A20636F7265205B707269765D
Feb 13 08:13:03.537000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D
Feb 13 08:13:03.901714 kernel: audit: type=1105 audit(1707811983.542:1905): pid=14234 uid=0 auid=500 ses=66 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:03.542000 audit[14234]: USER_START pid=14234 uid=0 auid=500 ses=66 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:03.996120 kernel: audit: type=1103 audit(1707811983.543:1906): pid=14236 uid=0 auid=500 ses=66 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:03.543000 audit[14236]: CRED_ACQ pid=14236 uid=0 auid=500 ses=66 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:04.085312 kernel: audit: type=1106 audit(1707811983.632:1907): pid=14234 uid=0 auid=500 ses=66 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:03.632000 audit[14234]: USER_END pid=14234 uid=0 auid=500 ses=66 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:04.180751 kernel: audit: type=1104 audit(1707811983.632:1908): pid=14234 uid=0 auid=500 ses=66 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:03.632000 audit[14234]: CRED_DISP pid=14234 uid=0 auid=500 ses=66 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:03.633000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@70-145.40.90.207:22-139.178.68.195:46036 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:13:04.706887 env[1458]: time="2024-02-13T08:13:04.706759660Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\""
Feb 13 08:13:04.733257 env[1458]: time="2024-02-13T08:13:04.733195444Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 08:13:04.733392 kubelet[2569]: E0213 08:13:04.733380 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100"
Feb 13 08:13:04.733552 kubelet[2569]: E0213 08:13:04.733407 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100}
Feb 13 08:13:04.733552 kubelet[2569]: E0213 08:13:04.733430 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 13 08:13:04.733552 kubelet[2569]: E0213 08:13:04.733446 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755
Feb 13 08:13:05.706351 env[1458]: time="2024-02-13T08:13:05.706219745Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\""
Feb 13 08:13:05.759182 env[1458]: time="2024-02-13T08:13:05.759089804Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 08:13:05.759539 kubelet[2569]: E0213 08:13:05.759358 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e"
Feb 13 08:13:05.759539 kubelet[2569]: E0213 08:13:05.759404 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e}
Feb 13 08:13:05.759539 kubelet[2569]: E0213 08:13:05.759446 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 13 08:13:05.759539 kubelet[2569]: E0213 08:13:05.759481 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b
Feb 13 08:13:08.641748 systemd[1]: Started sshd@71-145.40.90.207:22-139.178.68.195:47230.service.
Feb 13 08:13:08.641000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@71-145.40.90.207:22-139.178.68.195:47230 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:13:08.668484 kernel: kauditd_printk_skb: 1 callbacks suppressed
Feb 13 08:13:08.668574 kernel: audit: type=1130 audit(1707811988.641:1910): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@71-145.40.90.207:22-139.178.68.195:47230 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:13:08.705366 env[1458]: time="2024-02-13T08:13:08.705340239Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\""
Feb 13 08:13:08.718451 env[1458]: time="2024-02-13T08:13:08.718406522Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 08:13:08.718623 kubelet[2569]: E0213 08:13:08.718597 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b"
Feb 13 08:13:08.718623 kubelet[2569]: E0213 08:13:08.718626 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b}
Feb 13 08:13:08.719253 kubelet[2569]: E0213 08:13:08.718683 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 13 08:13:08.719253 kubelet[2569]: E0213 08:13:08.718721 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06
Feb 13 08:13:08.786000 audit[14320]: USER_ACCT pid=14320 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:08.787070 sshd[14320]: Accepted publickey for core from 139.178.68.195 port 47230 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg
Feb 13 08:13:08.787919 sshd[14320]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Feb 13 08:13:08.790366 systemd-logind[1446]: New session 67 of user core.
Feb 13 08:13:08.790849 systemd[1]: Started session-67.scope.
Feb 13 08:13:08.869419 sshd[14320]: pam_unix(sshd:session): session closed for user core
Feb 13 08:13:08.870859 systemd[1]: sshd@71-145.40.90.207:22-139.178.68.195:47230.service: Deactivated successfully.
Feb 13 08:13:08.871301 systemd[1]: session-67.scope: Deactivated successfully. Feb 13 08:13:08.871594 systemd-logind[1446]: Session 67 logged out. Waiting for processes to exit. Feb 13 08:13:08.871975 systemd-logind[1446]: Removed session 67. Feb 13 08:13:08.787000 audit[14320]: CRED_ACQ pid=14320 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:13:08.968938 kernel: audit: type=1101 audit(1707811988.786:1911): pid=14320 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:13:08.968977 kernel: audit: type=1103 audit(1707811988.787:1912): pid=14320 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:13:08.968995 kernel: audit: type=1006 audit(1707811988.787:1913): pid=14320 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=67 res=1 Feb 13 08:13:08.787000 audit[14320]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffcf861faf0 a2=3 a3=0 items=0 ppid=1 pid=14320 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=67 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:13:09.119497 kernel: audit: type=1300 audit(1707811988.787:1913): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffcf861faf0 a2=3 a3=0 items=0 ppid=1 pid=14320 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=67 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:13:09.119528 kernel: audit: type=1327 audit(1707811988.787:1913): proctitle=737368643A20636F7265205B707269765D Feb 13 08:13:08.787000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:13:09.149954 kernel: audit: type=1105 audit(1707811988.792:1914): pid=14320 uid=0 auid=500 ses=67 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:13:08.792000 audit[14320]: USER_START pid=14320 uid=0 auid=500 ses=67 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:13:09.244400 kernel: audit: type=1103 audit(1707811988.793:1915): pid=14351 uid=0 auid=500 ses=67 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:13:08.793000 audit[14351]: CRED_ACQ pid=14351 uid=0 auid=500 ses=67 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 
08:13:09.333618 kernel: audit: type=1106 audit(1707811988.869:1916): pid=14320 uid=0 auid=500 ses=67 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:13:08.869000 audit[14320]: USER_END pid=14320 uid=0 auid=500 ses=67 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:13:09.429152 kernel: audit: type=1104 audit(1707811988.869:1917): pid=14320 uid=0 auid=500 ses=67 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:13:08.869000 audit[14320]: CRED_DISP pid=14320 uid=0 auid=500 ses=67 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:13:08.870000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@71-145.40.90.207:22-139.178.68.195:47230 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:13:09.705535 env[1458]: time="2024-02-13T08:13:09.705462753Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 08:13:09.718461 env[1458]: time="2024-02-13T08:13:09.718400209Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:13:09.718571 kubelet[2569]: E0213 08:13:09.718544 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 08:13:09.718616 kubelet[2569]: E0213 08:13:09.718573 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 08:13:09.718616 kubelet[2569]: E0213 08:13:09.718599 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 
Feb 13 08:13:09.718841 kubelet[2569]: E0213 08:13:09.718618 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23
Feb 13 08:13:13.880575 systemd[1]: Started sshd@72-145.40.90.207:22-139.178.68.195:47236.service.
Feb 13 08:13:13.880000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@72-145.40.90.207:22-139.178.68.195:47236 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:13:13.907990 kernel: kauditd_printk_skb: 1 callbacks suppressed
Feb 13 08:13:13.908068 kernel: audit: type=1130 audit(1707811993.880:1919): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@72-145.40.90.207:22-139.178.68.195:47236 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:13:14.025000 audit[14401]: USER_ACCT pid=14401 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:14.026926 sshd[14401]: Accepted publickey for core from 139.178.68.195 port 47236 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg
Feb 13 08:13:14.030793 sshd[14401]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Feb 13 08:13:14.040531 systemd-logind[1446]: New session 68 of user core.
Feb 13 08:13:14.042809 systemd[1]: Started session-68.scope.
Feb 13 08:13:14.029000 audit[14401]: CRED_ACQ pid=14401 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:14.208674 kernel: audit: type=1101 audit(1707811994.025:1920): pid=14401 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:14.208750 kernel: audit: type=1103 audit(1707811994.029:1921): pid=14401 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:14.208768 kernel: audit: type=1006 audit(1707811994.029:1922): pid=14401 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=68 res=1
Feb 13 08:13:14.211207 sshd[14401]: pam_unix(sshd:session): session closed for user core
Feb 13 08:13:14.212620 systemd[1]: sshd@72-145.40.90.207:22-139.178.68.195:47236.service: Deactivated successfully.
Feb 13 08:13:14.213035 systemd[1]: session-68.scope: Deactivated successfully.
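
The StopPodSandbox failure above comes down to a single missing file: the Calico CNI plugin stats /var/lib/calico/nodename, which the error text itself says should be written by a running calico/node container that has mounted /var/lib/calico/. A minimal Python sketch that reproduces the same check on the host (the path is taken verbatim from the error; everything else is illustrative):

    # Reproduce the stat that the Calico CNI plugin fails on.
    import os

    path = "/var/lib/calico/nodename"  # path quoted in the error above
    if os.path.exists(path):
        with open(path) as f:
            print("calico nodename:", f.read().strip())
    else:
        # Matches the hint in the error: the calico/node container must be
        # running and must have mounted /var/lib/calico/ on this host.
        print(path, "is missing; check the calico/node container")
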
Feb 13 08:13:14.213399 systemd-logind[1446]: Session 68 logged out. Waiting for processes to exit.
Feb 13 08:13:14.213984 systemd-logind[1446]: Removed session 68.
Feb 13 08:13:14.029000 audit[14401]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffdde652640 a2=3 a3=0 items=0 ppid=1 pid=14401 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=68 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:13:14.359439 kernel: audit: type=1300 audit(1707811994.029:1922): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffdde652640 a2=3 a3=0 items=0 ppid=1 pid=14401 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=68 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:13:14.359482 kernel: audit: type=1327 audit(1707811994.029:1922): proctitle=737368643A20636F7265205B707269765D
Feb 13 08:13:14.029000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D
Feb 13 08:13:14.051000 audit[14401]: USER_START pid=14401 uid=0 auid=500 ses=68 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:14.484544 kernel: audit: type=1105 audit(1707811994.051:1923): pid=14401 uid=0 auid=500 ses=68 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:14.484624 kernel: audit: type=1103 audit(1707811994.052:1924): pid=14403 uid=0 auid=500 ses=68 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:14.052000 audit[14403]: CRED_ACQ pid=14403 uid=0 auid=500 ses=68 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:14.573856 kernel: audit: type=1106 audit(1707811994.211:1925): pid=14401 uid=0 auid=500 ses=68 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:14.211000 audit[14401]: USER_END pid=14401 uid=0 auid=500 ses=68 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:14.669432 kernel: audit: type=1104 audit(1707811994.211:1926): pid=14401 uid=0 auid=500 ses=68 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:14.211000 audit[14401]: CRED_DISP pid=14401 uid=0 auid=500 ses=68 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:14.212000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@72-145.40.90.207:22-139.178.68.195:47236 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:13:17.456395 systemd[1]: Started sshd@73-145.40.90.207:22-203.172.76.4:45292.service.
Feb 13 08:13:17.455000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@73-145.40.90.207:22-203.172.76.4:45292 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:13:18.629750 sshd[14426]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=203.172.76.4 user=root
Feb 13 08:13:18.629000 audit[14426]: USER_AUTH pid=14426 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? acct="root" exe="/usr/sbin/sshd" hostname=203.172.76.4 addr=203.172.76.4 terminal=ssh res=failed'
Feb 13 08:13:18.706752 env[1458]: time="2024-02-13T08:13:18.706630504Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\""
Feb 13 08:13:18.744844 env[1458]: time="2024-02-13T08:13:18.744776280Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 08:13:18.745007 kubelet[2569]: E0213 08:13:18.744968 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100"
Feb 13 08:13:18.745007 kubelet[2569]: E0213 08:13:18.745008 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100}
Feb 13 08:13:18.745184 kubelet[2569]: E0213 08:13:18.745029 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 13 08:13:18.745184 kubelet[2569]: E0213 08:13:18.745046 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755
Feb 13 08:13:19.221719 systemd[1]: Started sshd@74-145.40.90.207:22-139.178.68.195:40006.service.
Feb 13 08:13:19.221000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@74-145.40.90.207:22-139.178.68.195:40006 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:13:19.263067 kernel: kauditd_printk_skb: 3 callbacks suppressed
Feb 13 08:13:19.263171 kernel: audit: type=1130 audit(1707811999.221:1930): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@74-145.40.90.207:22-139.178.68.195:40006 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:13:19.293740 sshd[14460]: Accepted publickey for core from 139.178.68.195 port 40006 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg
Feb 13 08:13:19.294949 sshd[14460]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Feb 13 08:13:19.297669 systemd-logind[1446]: New session 69 of user core.
Feb 13 08:13:19.298272 systemd[1]: Started session-69.scope.
Feb 13 08:13:19.293000 audit[14460]: USER_ACCT pid=14460 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:19.376978 sshd[14460]: pam_unix(sshd:session): session closed for user core
Feb 13 08:13:19.378273 systemd[1]: sshd@74-145.40.90.207:22-139.178.68.195:40006.service: Deactivated successfully.
Feb 13 08:13:19.378711 systemd[1]: session-69.scope: Deactivated successfully.
Feb 13 08:13:19.379152 systemd-logind[1446]: Session 69 logged out. Waiting for processes to exit.
Feb 13 08:13:19.379557 systemd-logind[1446]: Removed session 69.
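
Each failed retry of this kind emits the same four kubelet records (remote_runtime.go:205, kuberuntime_manager.go:1312 and :1038, pod_workers.go:1294), so the journal fills with near-duplicates. A rough Python sketch for tallying the stuck pods from journal text in the format shown above (the field layout is assumed from these lines; feed it the log on stdin):

    # Count "Error syncing pod, skipping" records per pod/UID pair,
    # assuming the pod="..." podUID=... fields seen in the lines above.
    import collections
    import re
    import sys

    pat = re.compile(r'pod="(?P<pod>[^"]+)" podUID=(?P<uid>\S+)')
    counts = collections.Counter()
    for line in sys.stdin:
        if "Error syncing pod, skipping" in line:
            m = pat.search(line)
            if m:
                counts[(m.group("pod"), m.group("uid"))] += 1
    for (pod, uid), n in counts.most_common():
        print(f"{n:4d}  {pod}  podUID={uid}")
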
Feb 13 08:13:19.442449 kernel: audit: type=1101 audit(1707811999.293:1931): pid=14460 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:19.442545 kernel: audit: type=1103 audit(1707811999.294:1932): pid=14460 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:19.294000 audit[14460]: CRED_ACQ pid=14460 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:19.533109 kernel: audit: type=1006 audit(1707811999.294:1933): pid=14460 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=69 res=1
Feb 13 08:13:19.591806 kernel: audit: type=1300 audit(1707811999.294:1933): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffdcc5d0260 a2=3 a3=0 items=0 ppid=1 pid=14460 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=69 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:13:19.294000 audit[14460]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffdcc5d0260 a2=3 a3=0 items=0 ppid=1 pid=14460 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=69 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:13:19.683889 kernel: audit: type=1327 audit(1707811999.294:1933): proctitle=737368643A20636F7265205B707269765D
Feb 13 08:13:19.294000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D
Feb 13 08:13:19.299000 audit[14460]: USER_START pid=14460 uid=0 auid=500 ses=69 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:19.808945 kernel: audit: type=1105 audit(1707811999.299:1934): pid=14460 uid=0 auid=500 ses=69 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:19.809012 kernel: audit: type=1103 audit(1707811999.300:1935): pid=14462 uid=0 auid=500 ses=69 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:19.300000 audit[14462]: CRED_ACQ pid=14462 uid=0 auid=500 ses=69 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:19.898202 kernel: audit: type=1106 audit(1707811999.376:1936): pid=14460 uid=0 auid=500 ses=69 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:19.376000 audit[14460]: USER_END pid=14460 uid=0 auid=500 ses=69 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:19.376000 audit[14460]: CRED_DISP pid=14460 uid=0 auid=500 ses=69 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:20.083274 kernel: audit: type=1104 audit(1707811999.376:1937): pid=14460 uid=0 auid=500 ses=69 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:19.377000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@74-145.40.90.207:22-139.178.68.195:40006 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:13:20.707248 env[1458]: time="2024-02-13T08:13:20.707099806Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\""
Feb 13 08:13:20.707248 env[1458]: time="2024-02-13T08:13:20.707125406Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\""
Feb 13 08:13:20.724564 env[1458]: time="2024-02-13T08:13:20.724501067Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 08:13:20.724756 kubelet[2569]: E0213 08:13:20.724710 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b"
Feb 13 08:13:20.724756 kubelet[2569]: E0213 08:13:20.724744 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b}
Feb 13 08:13:20.724983 kubelet[2569]: E0213 08:13:20.724768 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 13 08:13:20.724983 kubelet[2569]: E0213 08:13:20.724791 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06
Feb 13 08:13:20.724983 kubelet[2569]: E0213 08:13:20.724904 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e"
Feb 13 08:13:20.724983 kubelet[2569]: E0213 08:13:20.724919 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e}
Feb 13 08:13:20.725115 env[1458]: time="2024-02-13T08:13:20.724818408Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 08:13:20.725143 kubelet[2569]: E0213 08:13:20.724939 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 13 08:13:20.725143 kubelet[2569]: E0213 08:13:20.724954 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b
Feb 13 08:13:20.887572 sshd[14426]: Failed password for root from 203.172.76.4 port 45292 ssh2
Feb 13 08:13:22.275945 sshd[14426]: Received disconnect from 203.172.76.4 port 45292:11: Bye Bye [preauth]
Feb 13 08:13:22.275945 sshd[14426]: Disconnected from authenticating user root 203.172.76.4 port 45292 [preauth]
Feb 13 08:13:22.278156 systemd[1]: sshd@73-145.40.90.207:22-203.172.76.4:45292.service: Deactivated successfully.
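
Unlike the publickey logins for "core" from 139.178.68.195, the connection from 203.172.76.4 above is a failed root password attempt: USER_AUTH with res=failed, then "Failed password" and a preauth disconnect. A small Python sketch for tallying such failures per source address, assuming the key=value audit format shown in these records:

    # Tally failed audit USER_AUTH records per source address.
    import collections
    import re
    import sys

    fails = collections.Counter()
    for line in sys.stdin:
        if "USER_AUTH" in line and "res=failed" in line:
            m = re.search(r"addr=(\S+)", line)
            if m:
                fails[m.group(1)] += 1
    for addr, n in fails.most_common():
        print(f"{addr}: {n} failed attempt(s)")
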
Feb 13 08:13:22.277000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@73-145.40.90.207:22-203.172.76.4:45292 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:13:24.386690 systemd[1]: Started sshd@75-145.40.90.207:22-139.178.68.195:40022.service.
Feb 13 08:13:24.386000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@75-145.40.90.207:22-139.178.68.195:40022 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:13:24.413940 kernel: kauditd_printk_skb: 2 callbacks suppressed
Feb 13 08:13:24.414047 kernel: audit: type=1130 audit(1707812004.386:1940): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@75-145.40.90.207:22-139.178.68.195:40022 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:13:24.530000 audit[14544]: USER_ACCT pid=14544 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:24.531096 sshd[14544]: Accepted publickey for core from 139.178.68.195 port 40022 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg
Feb 13 08:13:24.531906 sshd[14544]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Feb 13 08:13:24.534282 systemd-logind[1446]: New session 70 of user core.
Feb 13 08:13:24.534881 systemd[1]: Started session-70.scope.
Feb 13 08:13:24.616582 sshd[14544]: pam_unix(sshd:session): session closed for user core
Feb 13 08:13:24.618295 systemd[1]: sshd@75-145.40.90.207:22-139.178.68.195:40022.service: Deactivated successfully.
Feb 13 08:13:24.618814 systemd[1]: session-70.scope: Deactivated successfully.
Feb 13 08:13:24.619326 systemd-logind[1446]: Session 70 logged out. Waiting for processes to exit.
Feb 13 08:13:24.619992 systemd-logind[1446]: Removed session 70.
Feb 13 08:13:24.531000 audit[14544]: CRED_ACQ pid=14544 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:24.705884 env[1458]: time="2024-02-13T08:13:24.705815464Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\""
Feb 13 08:13:24.715139 kernel: audit: type=1101 audit(1707812004.530:1941): pid=14544 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:24.715221 kernel: audit: type=1103 audit(1707812004.531:1942): pid=14544 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:24.715239 kernel: audit: type=1006 audit(1707812004.531:1943): pid=14544 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=70 res=1
Feb 13 08:13:24.717777 env[1458]: time="2024-02-13T08:13:24.717702281Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 08:13:24.717936 kubelet[2569]: E0213 08:13:24.717897 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988"
Feb 13 08:13:24.717936 kubelet[2569]: E0213 08:13:24.717922 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988}
Feb 13 08:13:24.718132 kubelet[2569]: E0213 08:13:24.717945 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 13 08:13:24.718132 kubelet[2569]: E0213 08:13:24.717963 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23
Feb 13 08:13:24.773775 kernel: audit: type=1300 audit(1707812004.531:1943): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe4ecab950 a2=3 a3=0 items=0 ppid=1 pid=14544 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=70 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:13:24.531000 audit[14544]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe4ecab950 a2=3 a3=0 items=0 ppid=1 pid=14544 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=70 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:13:24.865768 kernel: audit: type=1327 audit(1707812004.531:1943): proctitle=737368643A20636F7265205B707269765D
Feb 13 08:13:24.531000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D
Feb 13 08:13:24.896253 kernel: audit: type=1105 audit(1707812004.537:1944): pid=14544 uid=0 auid=500 ses=70 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:24.537000 audit[14544]: USER_START pid=14544 uid=0 auid=500 ses=70 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:24.990819 kernel: audit: type=1103 audit(1707812004.537:1945): pid=14546 uid=0 auid=500 ses=70 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:24.537000 audit[14546]: CRED_ACQ pid=14546 uid=0 auid=500 ses=70 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:24.616000 audit[14544]: USER_END pid=14544 uid=0 auid=500 ses=70 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:25.175500 kernel: audit: type=1106 audit(1707812004.616:1946): pid=14544 uid=0 auid=500 ses=70 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:25.175535 kernel: audit: type=1104 audit(1707812004.616:1947): pid=14544 uid=0 auid=500 ses=70 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:24.616000 audit[14544]: CRED_DISP pid=14544 uid=0 auid=500 ses=70 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:24.617000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@75-145.40.90.207:22-139.178.68.195:40022 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:13:29.628570 systemd[1]: Started sshd@76-145.40.90.207:22-139.178.68.195:44838.service.
Feb 13 08:13:29.628000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@76-145.40.90.207:22-139.178.68.195:44838 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:13:29.657434 kernel: kauditd_printk_skb: 1 callbacks suppressed
Feb 13 08:13:29.657510 kernel: audit: type=1130 audit(1707812009.628:1949): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@76-145.40.90.207:22-139.178.68.195:44838 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:13:29.774000 audit[14597]: USER_ACCT pid=14597 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:29.775667 sshd[14597]: Accepted publickey for core from 139.178.68.195 port 44838 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg
Feb 13 08:13:29.776928 sshd[14597]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Feb 13 08:13:29.779636 systemd-logind[1446]: New session 71 of user core.
Feb 13 08:13:29.780753 systemd[1]: Started session-71.scope.
Feb 13 08:13:29.859754 sshd[14597]: pam_unix(sshd:session): session closed for user core
Feb 13 08:13:29.861617 systemd[1]: sshd@76-145.40.90.207:22-139.178.68.195:44838.service: Deactivated successfully.
Feb 13 08:13:29.862046 systemd[1]: session-71.scope: Deactivated successfully.
Feb 13 08:13:29.862376 systemd-logind[1446]: Session 71 logged out. Waiting for processes to exit.
Feb 13 08:13:29.862923 systemd-logind[1446]: Removed session 71.
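
The per-connection unit names above, such as sshd@76-145.40.90.207:22-139.178.68.195:44838.service, appear to encode an instance counter plus the local and remote TCP endpoints, which is why every SERVICE_START/SERVICE_STOP pair names a unique unit. A Python parsing sketch under that assumption (the helper name and splitting rule are illustrative, inferred from the names observed here):

    # Split an sshd@ per-connection unit name (format as observed above)
    # into instance number, local endpoint, and remote endpoint.
    def parse_sshd_unit(unit: str):
        inst = unit.removeprefix("sshd@").removesuffix(".service")
        n, local, remote = inst.split("-", 2)
        return int(n), local, remote

    print(parse_sshd_unit("sshd@76-145.40.90.207:22-139.178.68.195:44838.service"))
    # -> (76, '145.40.90.207:22', '139.178.68.195:44838')
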
Feb 13 08:13:29.776000 audit[14597]: CRED_ACQ pid=14597 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:29.959066 kernel: audit: type=1101 audit(1707812009.774:1950): pid=14597 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:29.959110 kernel: audit: type=1103 audit(1707812009.776:1951): pid=14597 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:29.959129 kernel: audit: type=1006 audit(1707812009.776:1952): pid=14597 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=71 res=1
Feb 13 08:13:30.017719 kernel: audit: type=1300 audit(1707812009.776:1952): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd562b9cc0 a2=3 a3=0 items=0 ppid=1 pid=14597 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=71 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:13:29.776000 audit[14597]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd562b9cc0 a2=3 a3=0 items=0 ppid=1 pid=14597 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=71 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:13:30.109708 kernel: audit: type=1327 audit(1707812009.776:1952): proctitle=737368643A20636F7265205B707269765D
Feb 13 08:13:29.776000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D
Feb 13 08:13:30.140136 kernel: audit: type=1105 audit(1707812009.782:1953): pid=14597 uid=0 auid=500 ses=71 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:29.782000 audit[14597]: USER_START pid=14597 uid=0 auid=500 ses=71 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:30.234618 kernel: audit: type=1103 audit(1707812009.783:1954): pid=14599 uid=0 auid=500 ses=71 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:29.783000 audit[14599]: CRED_ACQ pid=14599 uid=0 auid=500 ses=71 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:30.323814 kernel: audit: type=1106 audit(1707812009.859:1955): pid=14597 uid=0 auid=500 ses=71 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:29.859000 audit[14597]: USER_END pid=14597 uid=0 auid=500 ses=71 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:30.419330 kernel: audit: type=1104 audit(1707812009.860:1956): pid=14597 uid=0 auid=500 ses=71 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:29.860000 audit[14597]: CRED_DISP pid=14597 uid=0 auid=500 ses=71 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:29.861000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@76-145.40.90.207:22-139.178.68.195:44838 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:13:30.707010 env[1458]: time="2024-02-13T08:13:30.706774089Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\""
Feb 13 08:13:30.736568 env[1458]: time="2024-02-13T08:13:30.736512078Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 08:13:30.736803 kubelet[2569]: E0213 08:13:30.736744 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100"
Feb 13 08:13:30.736803 kubelet[2569]: E0213 08:13:30.736800 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100}
Feb 13 08:13:30.737004 kubelet[2569]: E0213 08:13:30.736823 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 13 08:13:30.737004 kubelet[2569]: E0213 08:13:30.736842 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755
Feb 13 08:13:34.706459 env[1458]: time="2024-02-13T08:13:34.706320352Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\""
Feb 13 08:13:34.732625 env[1458]: time="2024-02-13T08:13:34.732540877Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 08:13:34.732864 kubelet[2569]: E0213 08:13:34.732808 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b"
Feb 13 08:13:34.732864 kubelet[2569]: E0213 08:13:34.732847 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b}
Feb 13 08:13:34.733043 kubelet[2569]: E0213 08:13:34.732868 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 13 08:13:34.733043 kubelet[2569]: E0213 08:13:34.732886 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06
Feb 13 08:13:34.871084 systemd[1]: Started sshd@77-145.40.90.207:22-139.178.68.195:44846.service.
Feb 13 08:13:34.870000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@77-145.40.90.207:22-139.178.68.195:44846 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:13:34.898459 kernel: kauditd_printk_skb: 1 callbacks suppressed
Feb 13 08:13:34.898495 kernel: audit: type=1130 audit(1707812014.870:1958): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@77-145.40.90.207:22-139.178.68.195:44846 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:13:35.015000 audit[14681]: USER_ACCT pid=14681 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:35.016116 sshd[14681]: Accepted publickey for core from 139.178.68.195 port 44846 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg
Feb 13 08:13:35.018709 sshd[14681]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Feb 13 08:13:35.026339 systemd-logind[1446]: New session 72 of user core.
Feb 13 08:13:35.028092 systemd[1]: Started session-72.scope.
Feb 13 08:13:35.017000 audit[14681]: CRED_ACQ pid=14681 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:35.197843 kernel: audit: type=1101 audit(1707812015.015:1959): pid=14681 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:35.197886 kernel: audit: type=1103 audit(1707812015.017:1960): pid=14681 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:35.197908 kernel: audit: type=1006 audit(1707812015.017:1961): pid=14681 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=72 res=1
Feb 13 08:13:35.256506 kernel: audit: type=1300 audit(1707812015.017:1961): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd513c02e0 a2=3 a3=0 items=0 ppid=1 pid=14681 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=72 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:13:35.017000 audit[14681]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd513c02e0 a2=3 a3=0 items=0 ppid=1 pid=14681 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=72 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:13:35.348619 kernel: audit: type=1327 audit(1707812015.017:1961): proctitle=737368643A20636F7265205B707269765D
Feb 13 08:13:35.017000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D
Feb 13 08:13:35.348854 sshd[14681]: pam_unix(sshd:session): session closed for user core
Feb 13 08:13:35.350353 systemd[1]: sshd@77-145.40.90.207:22-139.178.68.195:44846.service: Deactivated successfully.
Feb 13 08:13:35.350778 systemd[1]: session-72.scope: Deactivated successfully.
Feb 13 08:13:35.351226 systemd-logind[1446]: Session 72 logged out. Waiting for processes to exit.
Feb 13 08:13:35.351726 systemd-logind[1446]: Removed session 72.
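
Every kernel-printed audit record above carries a stamp of the form audit(<epoch>.<millis>:<serial>). The epoch matches the adjacent journal timestamp (1707812015.348 corresponds to 08:13:35.348 UTC on Feb 13, 2024), and the serial increases monotonically, which is how the raw audit records and their duplicated kernel printk copies can be paired. A Python decoding sketch (the stamp string is copied from the records above):

    # Decode the audit(<epoch>.<millis>:<serial>) stamp into UTC time.
    import re
    from datetime import datetime, timezone

    def parse_audit_stamp(s: str):
        m = re.search(r"audit\((\d+)\.(\d+):(\d+)\)", s)
        epoch, millis, serial = (int(g) for g in m.groups())
        ts = datetime.fromtimestamp(epoch, tz=timezone.utc)
        return ts.isoformat(), millis, serial

    print(parse_audit_stamp("audit(1707812015.348:1964)"))
    # -> ('2024-02-13T08:13:35+00:00', 348, 1964)
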
Feb 13 08:13:35.379120 kernel: audit: type=1105 audit(1707812015.037:1962): pid=14681 uid=0 auid=500 ses=72 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:35.037000 audit[14681]: USER_START pid=14681 uid=0 auid=500 ses=72 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:35.473565 kernel: audit: type=1103 audit(1707812015.039:1963): pid=14683 uid=0 auid=500 ses=72 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:35.039000 audit[14683]: CRED_ACQ pid=14683 uid=0 auid=500 ses=72 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:35.348000 audit[14681]: USER_END pid=14681 uid=0 auid=500 ses=72 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:35.658238 kernel: audit: type=1106 audit(1707812015.348:1964): pid=14681 uid=0 auid=500 ses=72 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:35.658272 kernel: audit: type=1104 audit(1707812015.349:1965): pid=14681 uid=0 auid=500 ses=72 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:35.349000 audit[14681]: CRED_DISP pid=14681 uid=0 auid=500 ses=72 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:35.705227 env[1458]: time="2024-02-13T08:13:35.705207220Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\""
Feb 13 08:13:35.717216 env[1458]: time="2024-02-13T08:13:35.717145017Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 08:13:35.717449 kubelet[2569]: E0213 08:13:35.717332 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e"
Feb 13 08:13:35.717449 kubelet[2569]: E0213 08:13:35.717359 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e}
Feb 13 08:13:35.717449 kubelet[2569]: E0213 08:13:35.717382 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 13 08:13:35.717449 kubelet[2569]: E0213 08:13:35.717400 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b
Feb 13 08:13:35.349000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@77-145.40.90.207:22-139.178.68.195:44846 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:13:37.706111 env[1458]: time="2024-02-13T08:13:37.705982022Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\""
Feb 13 08:13:37.733114 env[1458]: time="2024-02-13T08:13:37.733078830Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 08:13:37.733302 kubelet[2569]: E0213 08:13:37.733260 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988"
Feb 13 08:13:37.733302 kubelet[2569]: E0213 08:13:37.733284 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988}
Feb 13 08:13:37.733476 kubelet[2569]: E0213 08:13:37.733305 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 13 08:13:37.733476 kubelet[2569]: E0213 08:13:37.733324 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23
Feb 13 08:13:40.300755 systemd[1]: Started sshd@78-145.40.90.207:22-139.178.68.195:36020.service.
Feb 13 08:13:40.300000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@78-145.40.90.207:22-139.178.68.195:36020 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:13:40.327552 kernel: kauditd_printk_skb: 1 callbacks suppressed
Feb 13 08:13:40.327594 kernel: audit: type=1130 audit(1707812020.300:1967): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@78-145.40.90.207:22-139.178.68.195:36020 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:13:40.443000 audit[14765]: USER_ACCT pid=14765 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:40.444092 sshd[14765]: Accepted publickey for core from 139.178.68.195 port 36020 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg
Feb 13 08:13:40.445873 sshd[14765]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Feb 13 08:13:40.448271 systemd-logind[1446]: New session 73 of user core.
Feb 13 08:13:40.448798 systemd[1]: Started session-73.scope.
Feb 13 08:13:40.529865 sshd[14765]: pam_unix(sshd:session): session closed for user core
Feb 13 08:13:40.531314 systemd[1]: sshd@78-145.40.90.207:22-139.178.68.195:36020.service: Deactivated successfully.
Feb 13 08:13:40.531787 systemd[1]: session-73.scope: Deactivated successfully.
Feb 13 08:13:40.532229 systemd-logind[1446]: Session 73 logged out. Waiting for processes to exit.
Feb 13 08:13:40.532616 systemd-logind[1446]: Removed session 73.
Feb 13 08:13:40.445000 audit[14765]: CRED_ACQ pid=14765 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:40.627884 kernel: audit: type=1101 audit(1707812020.443:1968): pid=14765 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:40.627919 kernel: audit: type=1103 audit(1707812020.445:1969): pid=14765 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:40.627936 kernel: audit: type=1006 audit(1707812020.445:1970): pid=14765 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=73 res=1
Feb 13 08:13:40.686448 kernel: audit: type=1300 audit(1707812020.445:1970): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff1491ba50 a2=3 a3=0 items=0 ppid=1 pid=14765 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=73 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:13:40.445000 audit[14765]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff1491ba50 a2=3 a3=0 items=0 ppid=1 pid=14765 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=73 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:13:40.778422 kernel: audit: type=1327 audit(1707812020.445:1970): proctitle=737368643A20636F7265205B707269765D
Feb 13 08:13:40.445000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D
Feb 13 08:13:40.808899 kernel: audit: type=1105 audit(1707812020.450:1971): pid=14765 uid=0 auid=500 ses=73 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:40.450000 audit[14765]: USER_START pid=14765 uid=0 auid=500 ses=73 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:40.903324 kernel: audit: type=1103 audit(1707812020.451:1972): pid=14767 uid=0 auid=500 ses=73 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:40.451000 audit[14767]: CRED_ACQ pid=14767 uid=0 auid=500 ses=73 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:40.905424 systemd[1]: Started sshd@79-145.40.90.207:22-43.154.183.138:32934.service.
Feb 13 08:13:40.529000 audit[14765]: USER_END pid=14765 uid=0 auid=500 ses=73 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:40.992754 kernel: audit: type=1106 audit(1707812020.529:1973): pid=14765 uid=0 auid=500 ses=73 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:40.529000 audit[14765]: CRED_DISP pid=14765 uid=0 auid=500 ses=73 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:41.177352 kernel: audit: type=1104 audit(1707812020.529:1974): pid=14765 uid=0 auid=500 ses=73 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:13:40.530000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@78-145.40.90.207:22-139.178.68.195:36020 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:13:40.904000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@79-145.40.90.207:22-43.154.183.138:32934 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:13:41.821243 sshd[14790]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=43.154.183.138 user=root
Feb 13 08:13:41.820000 audit[14790]: USER_AUTH pid=14790 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=?
acct="root" exe="/usr/sbin/sshd" hostname=43.154.183.138 addr=43.154.183.138 terminal=ssh res=failed' Feb 13 08:13:42.706905 env[1458]: time="2024-02-13T08:13:42.706813147Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 08:13:42.734091 env[1458]: time="2024-02-13T08:13:42.734056421Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:13:42.734312 kubelet[2569]: E0213 08:13:42.734271 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 08:13:42.734312 kubelet[2569]: E0213 08:13:42.734297 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 08:13:42.734503 kubelet[2569]: E0213 08:13:42.734319 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:13:42.734503 kubelet[2569]: E0213 08:13:42.734337 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 08:13:43.902972 sshd[14790]: Failed password for root from 43.154.183.138 port 32934 ssh2 Feb 13 08:13:45.419557 sshd[14790]: Received disconnect from 43.154.183.138 port 32934:11: Bye Bye [preauth] Feb 13 08:13:45.419557 sshd[14790]: Disconnected from authenticating user root 43.154.183.138 port 32934 [preauth] Feb 13 08:13:45.422135 systemd[1]: sshd@79-145.40.90.207:22-43.154.183.138:32934.service: Deactivated successfully. Feb 13 08:13:45.422000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@79-145.40.90.207:22-43.154.183.138:32934 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:13:45.448984 kernel: kauditd_printk_skb: 3 callbacks suppressed Feb 13 08:13:45.449026 kernel: audit: type=1131 audit(1707812025.422:1978): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@79-145.40.90.207:22-43.154.183.138:32934 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:13:45.533068 systemd[1]: Started sshd@80-145.40.90.207:22-139.178.68.195:36026.service. Feb 13 08:13:45.532000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@80-145.40.90.207:22-139.178.68.195:36026 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:13:45.626904 kernel: audit: type=1130 audit(1707812025.532:1979): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@80-145.40.90.207:22-139.178.68.195:36026 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:13:45.654000 audit[14826]: USER_ACCT pid=14826 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:13:45.655010 sshd[14826]: Accepted publickey for core from 139.178.68.195 port 36026 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:13:45.656931 sshd[14826]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:13:45.659352 systemd-logind[1446]: New session 74 of user core. Feb 13 08:13:45.659824 systemd[1]: Started session-74.scope. Feb 13 08:13:45.705105 env[1458]: time="2024-02-13T08:13:45.705038186Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 08:13:45.718160 env[1458]: time="2024-02-13T08:13:45.718095326Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:13:45.718306 kubelet[2569]: E0213 08:13:45.718293 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 08:13:45.718498 kubelet[2569]: E0213 08:13:45.718325 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 08:13:45.718498 kubelet[2569]: E0213 08:13:45.718358 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": 
plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:13:45.718498 kubelet[2569]: E0213 08:13:45.718385 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 08:13:45.738167 sshd[14826]: pam_unix(sshd:session): session closed for user core Feb 13 08:13:45.739522 systemd[1]: sshd@80-145.40.90.207:22-139.178.68.195:36026.service: Deactivated successfully. Feb 13 08:13:45.739950 systemd[1]: session-74.scope: Deactivated successfully. Feb 13 08:13:45.740341 systemd-logind[1446]: Session 74 logged out. Waiting for processes to exit. Feb 13 08:13:45.740944 systemd-logind[1446]: Removed session 74. Feb 13 08:13:45.656000 audit[14826]: CRED_ACQ pid=14826 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:13:45.837594 kernel: audit: type=1101 audit(1707812025.654:1980): pid=14826 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:13:45.837687 kernel: audit: type=1103 audit(1707812025.656:1981): pid=14826 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:13:45.837705 kernel: audit: type=1006 audit(1707812025.656:1982): pid=14826 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=74 res=1 Feb 13 08:13:45.656000 audit[14826]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffefef7c850 a2=3 a3=0 items=0 ppid=1 pid=14826 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=74 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:13:45.988286 kernel: audit: type=1300 audit(1707812025.656:1982): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffefef7c850 a2=3 a3=0 items=0 ppid=1 pid=14826 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=74 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:13:45.988323 kernel: audit: type=1327 audit(1707812025.656:1982): proctitle=737368643A20636F7265205B707269765D Feb 13 08:13:45.656000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:13:46.018722 kernel: audit: type=1105 audit(1707812025.661:1983): pid=14826 uid=0 auid=500 ses=74 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" 
exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:13:45.661000 audit[14826]: USER_START pid=14826 uid=0 auid=500 ses=74 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:13:46.114139 kernel: audit: type=1103 audit(1707812025.661:1984): pid=14831 uid=0 auid=500 ses=74 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:13:45.661000 audit[14831]: CRED_ACQ pid=14831 uid=0 auid=500 ses=74 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:13:45.738000 audit[14826]: USER_END pid=14826 uid=0 auid=500 ses=74 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:13:46.299100 kernel: audit: type=1106 audit(1707812025.738:1985): pid=14826 uid=0 auid=500 ses=74 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:13:45.738000 audit[14826]: CRED_DISP pid=14826 uid=0 auid=500 ses=74 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:13:45.739000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@80-145.40.90.207:22-139.178.68.195:36026 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:13:46.123000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:13:46.123000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c000db4240 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:13:46.123000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:13:46.123000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:13:46.123000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c0013ba900 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:13:46.123000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:13:46.191000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:13:46.191000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5c a1=c00a0f6600 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:13:46.191000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:13:46.191000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:13:46.191000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5d a1=c00477f3c0 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:13:46.191000 
audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:13:46.191000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/apiserver.crt" dev="sdb9" ino=524817 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:13:46.191000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5c a1=c000ff1650 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:13:46.191000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:13:46.945000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:13:46.945000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5c a1=c004133500 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:13:46.945000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:13:46.945000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:13:46.945000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5d a1=c013fd11a0 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:13:46.945000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:13:46.945000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-client.crt" dev="sdb9" ino=524823 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:13:46.945000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5c a1=c00806a000 a2=fc6 a3=0 items=0 
ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:13:46.945000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:13:47.706376 env[1458]: time="2024-02-13T08:13:47.706276027Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 08:13:47.756174 env[1458]: time="2024-02-13T08:13:47.756103250Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:13:47.756420 kubelet[2569]: E0213 08:13:47.756398 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 08:13:47.756781 kubelet[2569]: E0213 08:13:47.756446 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 08:13:47.756781 kubelet[2569]: E0213 08:13:47.756500 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:13:47.756781 kubelet[2569]: E0213 08:13:47.756540 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 08:13:49.705868 env[1458]: time="2024-02-13T08:13:49.705840525Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 08:13:49.719215 env[1458]: time="2024-02-13T08:13:49.719156817Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy 
network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:13:49.719378 kubelet[2569]: E0213 08:13:49.719333 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 08:13:49.719378 kubelet[2569]: E0213 08:13:49.719362 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 08:13:49.719594 kubelet[2569]: E0213 08:13:49.719390 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:13:49.719594 kubelet[2569]: E0213 08:13:49.719410 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 08:13:50.747947 systemd[1]: Started sshd@81-145.40.90.207:22-139.178.68.195:56296.service. Feb 13 08:13:50.747000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@81-145.40.90.207:22-139.178.68.195:56296 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:13:50.775158 kernel: kauditd_printk_skb: 26 callbacks suppressed Feb 13 08:13:50.775324 kernel: audit: type=1130 audit(1707812030.747:1996): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@81-145.40.90.207:22-139.178.68.195:56296 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:13:50.877000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:13:50.893642 sshd[14942]: Accepted publickey for core from 139.178.68.195 port 56296 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:13:50.894218 sshd[14942]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:13:50.896670 systemd-logind[1446]: New session 75 of user core. Feb 13 08:13:50.897134 systemd[1]: Started session-75.scope. Feb 13 08:13:50.877000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c0010a0480 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:13:50.976929 sshd[14942]: pam_unix(sshd:session): session closed for user core Feb 13 08:13:50.978276 systemd[1]: sshd@81-145.40.90.207:22-139.178.68.195:56296.service: Deactivated successfully. Feb 13 08:13:50.978872 systemd[1]: session-75.scope: Deactivated successfully. Feb 13 08:13:50.979316 systemd-logind[1446]: Session 75 logged out. Waiting for processes to exit. Feb 13 08:13:50.979844 systemd-logind[1446]: Removed session 75. Feb 13 08:13:51.088029 kernel: audit: type=1400 audit(1707812030.877:1997): avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:13:51.088066 kernel: audit: type=1300 audit(1707812030.877:1997): arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c0010a0480 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:13:51.088085 kernel: audit: type=1327 audit(1707812030.877:1997): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:13:50.877000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:13:51.181015 kernel: audit: type=1400 audit(1707812030.877:1998): avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:13:50.877000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:13:51.270862 kernel: audit: type=1300 audit(1707812030.877:1998): arch=c000003e syscall=254 
success=no exit=-13 a0=a a1=c002c3f8e0 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:13:50.877000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c002c3f8e0 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:13:51.391305 kernel: audit: type=1327 audit(1707812030.877:1998): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:13:50.877000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:13:51.484942 kernel: audit: type=1400 audit(1707812030.878:1999): avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:13:50.878000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:13:51.575495 kernel: audit: type=1300 audit(1707812030.878:1999): arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c002b26f80 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:13:50.878000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c002b26f80 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:13:51.696330 kernel: audit: type=1327 audit(1707812030.878:1999): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:13:50.878000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:13:50.883000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 
scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:13:50.883000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c002b270c0 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:13:50.883000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:13:50.888000 audit[14942]: USER_ACCT pid=14942 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:13:50.893000 audit[14942]: CRED_ACQ pid=14942 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:13:50.893000 audit[14942]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe856bbee0 a2=3 a3=0 items=0 ppid=1 pid=14942 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=75 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:13:50.893000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:13:50.898000 audit[14942]: USER_START pid=14942 uid=0 auid=500 ses=75 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:13:50.898000 audit[14944]: CRED_ACQ pid=14944 uid=0 auid=500 ses=75 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:13:50.973000 audit[14942]: USER_END pid=14942 uid=0 auid=500 ses=75 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:13:50.973000 audit[14942]: CRED_DISP pid=14942 uid=0 auid=500 ses=75 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:13:50.973000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@81-145.40.90.207:22-139.178.68.195:56296 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:13:53.706896 env[1458]: time="2024-02-13T08:13:53.706804592Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 08:13:53.755771 env[1458]: time="2024-02-13T08:13:53.755705148Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:13:53.755992 kubelet[2569]: E0213 08:13:53.755967 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 08:13:53.756381 kubelet[2569]: E0213 08:13:53.756023 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 08:13:53.756381 kubelet[2569]: E0213 08:13:53.756074 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:13:53.756381 kubelet[2569]: E0213 08:13:53.756120 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 08:13:55.989831 systemd[1]: Started sshd@82-145.40.90.207:22-139.178.68.195:56300.service. Feb 13 08:13:55.989000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@82-145.40.90.207:22-139.178.68.195:56300 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:13:56.017461 kernel: kauditd_printk_skb: 13 callbacks suppressed Feb 13 08:13:56.017524 kernel: audit: type=1130 audit(1707812035.989:2009): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@82-145.40.90.207:22-139.178.68.195:56300 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:13:56.134000 audit[14996]: USER_ACCT pid=14996 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:13:56.134940 sshd[14996]: Accepted publickey for core from 139.178.68.195 port 56300 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:13:56.136913 sshd[14996]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:13:56.139539 systemd-logind[1446]: New session 76 of user core. Feb 13 08:13:56.140003 systemd[1]: Started session-76.scope. Feb 13 08:13:56.221365 sshd[14996]: pam_unix(sshd:session): session closed for user core Feb 13 08:13:56.222849 systemd[1]: sshd@82-145.40.90.207:22-139.178.68.195:56300.service: Deactivated successfully. Feb 13 08:13:56.223280 systemd[1]: session-76.scope: Deactivated successfully. Feb 13 08:13:56.223558 systemd-logind[1446]: Session 76 logged out. Waiting for processes to exit. Feb 13 08:13:56.224050 systemd-logind[1446]: Removed session 76. Feb 13 08:13:56.136000 audit[14996]: CRED_ACQ pid=14996 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:13:56.319253 kernel: audit: type=1101 audit(1707812036.134:2010): pid=14996 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:13:56.319290 kernel: audit: type=1103 audit(1707812036.136:2011): pid=14996 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:13:56.319305 kernel: audit: type=1006 audit(1707812036.136:2012): pid=14996 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=76 res=1 Feb 13 08:13:56.378440 kernel: audit: type=1300 audit(1707812036.136:2012): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffcbde35960 a2=3 a3=0 items=0 ppid=1 pid=14996 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=76 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:13:56.136000 audit[14996]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffcbde35960 a2=3 a3=0 items=0 ppid=1 pid=14996 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=76 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:13:56.471469 kernel: audit: type=1327 audit(1707812036.136:2012): proctitle=737368643A20636F7265205B707269765D Feb 13 08:13:56.136000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:13:56.502389 kernel: audit: type=1105 audit(1707812036.141:2013): pid=14996 uid=0 auid=500 ses=76 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 
08:13:56.141000 audit[14996]: USER_START pid=14996 uid=0 auid=500 ses=76 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:13:56.598154 kernel: audit: type=1103 audit(1707812036.142:2014): pid=14998 uid=0 auid=500 ses=76 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:13:56.142000 audit[14998]: CRED_ACQ pid=14998 uid=0 auid=500 ses=76 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:13:56.687770 kernel: audit: type=1106 audit(1707812036.221:2015): pid=14996 uid=0 auid=500 ses=76 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:13:56.221000 audit[14996]: USER_END pid=14996 uid=0 auid=500 ses=76 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:13:56.783597 kernel: audit: type=1104 audit(1707812036.221:2016): pid=14996 uid=0 auid=500 ses=76 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:13:56.221000 audit[14996]: CRED_DISP pid=14996 uid=0 auid=500 ses=76 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:13:56.222000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@82-145.40.90.207:22-139.178.68.195:56300 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:14:00.706886 env[1458]: time="2024-02-13T08:14:00.706783523Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 08:14:00.732924 env[1458]: time="2024-02-13T08:14:00.732863037Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:14:00.733089 kubelet[2569]: E0213 08:14:00.733053 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 08:14:00.733089 kubelet[2569]: E0213 08:14:00.733080 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 08:14:00.733272 kubelet[2569]: E0213 08:14:00.733100 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:14:00.733272 kubelet[2569]: E0213 08:14:00.733118 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 08:14:01.224470 systemd[1]: Started sshd@83-145.40.90.207:22-139.178.68.195:56516.service. Feb 13 08:14:01.223000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@83-145.40.90.207:22-139.178.68.195:56516 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:14:01.251765 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:14:01.251848 kernel: audit: type=1130 audit(1707812041.223:2018): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@83-145.40.90.207:22-139.178.68.195:56516 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:14:01.370000 audit[15049]: USER_ACCT pid=15049 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:01.371125 sshd[15049]: Accepted publickey for core from 139.178.68.195 port 56516 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:14:01.371694 sshd[15049]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:14:01.374145 systemd-logind[1446]: New session 77 of user core. Feb 13 08:14:01.374581 systemd[1]: Started session-77.scope. Feb 13 08:14:01.454708 sshd[15049]: pam_unix(sshd:session): session closed for user core Feb 13 08:14:01.456208 systemd[1]: sshd@83-145.40.90.207:22-139.178.68.195:56516.service: Deactivated successfully. Feb 13 08:14:01.456643 systemd[1]: session-77.scope: Deactivated successfully. Feb 13 08:14:01.456987 systemd-logind[1446]: Session 77 logged out. Waiting for processes to exit. Feb 13 08:14:01.457414 systemd-logind[1446]: Removed session 77. Feb 13 08:14:01.370000 audit[15049]: CRED_ACQ pid=15049 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:01.553459 kernel: audit: type=1101 audit(1707812041.370:2019): pid=15049 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:01.553505 kernel: audit: type=1103 audit(1707812041.370:2020): pid=15049 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:01.553522 kernel: audit: type=1006 audit(1707812041.370:2021): pid=15049 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=77 res=1 Feb 13 08:14:01.612061 kernel: audit: type=1300 audit(1707812041.370:2021): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff7e812630 a2=3 a3=0 items=0 ppid=1 pid=15049 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=77 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:14:01.370000 audit[15049]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff7e812630 a2=3 a3=0 items=0 ppid=1 pid=15049 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=77 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:14:01.704017 kernel: audit: type=1327 audit(1707812041.370:2021): proctitle=737368643A20636F7265205B707269765D Feb 13 08:14:01.370000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:14:01.705523 env[1458]: time="2024-02-13T08:14:01.705478737Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 08:14:01.705523 env[1458]: time="2024-02-13T08:14:01.705479138Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 08:14:01.717572 env[1458]: 
time="2024-02-13T08:14:01.717534343Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:14:01.717836 env[1458]: time="2024-02-13T08:14:01.717566485Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:14:01.717863 kubelet[2569]: E0213 08:14:01.717722 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 08:14:01.717863 kubelet[2569]: E0213 08:14:01.717752 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 08:14:01.717863 kubelet[2569]: E0213 08:14:01.717776 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:14:01.717863 kubelet[2569]: E0213 08:14:01.717795 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 08:14:01.718007 kubelet[2569]: E0213 08:14:01.717721 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 08:14:01.718007 kubelet[2569]: E0213 08:14:01.717811 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" 
podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 08:14:01.718007 kubelet[2569]: E0213 08:14:01.717830 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:14:01.718007 kubelet[2569]: E0213 08:14:01.717843 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 08:14:01.734473 kernel: audit: type=1105 audit(1707812041.375:2022): pid=15049 uid=0 auid=500 ses=77 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:01.375000 audit[15049]: USER_START pid=15049 uid=0 auid=500 ses=77 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:01.829033 kernel: audit: type=1103 audit(1707812041.376:2023): pid=15051 uid=0 auid=500 ses=77 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:01.376000 audit[15051]: CRED_ACQ pid=15051 uid=0 auid=500 ses=77 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:01.918315 kernel: audit: type=1106 audit(1707812041.454:2024): pid=15049 uid=0 auid=500 ses=77 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:01.454000 audit[15049]: USER_END pid=15049 uid=0 auid=500 ses=77 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:01.454000 audit[15049]: CRED_DISP pid=15049 uid=0 auid=500 ses=77 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" 
hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:02.103308 kernel: audit: type=1104 audit(1707812041.454:2025): pid=15049 uid=0 auid=500 ses=77 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:01.455000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@83-145.40.90.207:22-139.178.68.195:56516 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:14:05.706789 env[1458]: time="2024-02-13T08:14:05.706626087Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 08:14:05.757836 env[1458]: time="2024-02-13T08:14:05.757777449Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:14:05.758099 kubelet[2569]: E0213 08:14:05.758050 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 08:14:05.758099 kubelet[2569]: E0213 08:14:05.758092 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 08:14:05.758470 kubelet[2569]: E0213 08:14:05.758135 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:14:05.758470 kubelet[2569]: E0213 08:14:05.758170 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 08:14:06.464750 systemd[1]: Started sshd@84-145.40.90.207:22-139.178.68.195:41674.service. 
Feb 13 08:14:06.464000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@84-145.40.90.207:22-139.178.68.195:41674 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:14:06.491602 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:14:06.491677 kernel: audit: type=1130 audit(1707812046.464:2027): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@84-145.40.90.207:22-139.178.68.195:41674 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:14:06.609000 audit[15161]: USER_ACCT pid=15161 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:06.610810 sshd[15161]: Accepted publickey for core from 139.178.68.195 port 41674 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:14:06.614919 sshd[15161]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:14:06.619059 systemd-logind[1446]: New session 78 of user core. Feb 13 08:14:06.619536 systemd[1]: Started session-78.scope. Feb 13 08:14:06.698768 sshd[15161]: pam_unix(sshd:session): session closed for user core Feb 13 08:14:06.700229 systemd[1]: sshd@84-145.40.90.207:22-139.178.68.195:41674.service: Deactivated successfully. Feb 13 08:14:06.700666 systemd[1]: session-78.scope: Deactivated successfully. Feb 13 08:14:06.701082 systemd-logind[1446]: Session 78 logged out. Waiting for processes to exit. Feb 13 08:14:06.701558 systemd-logind[1446]: Removed session 78. 
Feb 13 08:14:06.613000 audit[15161]: CRED_ACQ pid=15161 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:06.793016 kernel: audit: type=1101 audit(1707812046.609:2028): pid=15161 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:06.793057 kernel: audit: type=1103 audit(1707812046.613:2029): pid=15161 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:06.793076 kernel: audit: type=1006 audit(1707812046.613:2030): pid=15161 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=78 res=1 Feb 13 08:14:06.851713 kernel: audit: type=1300 audit(1707812046.613:2030): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff16c586d0 a2=3 a3=0 items=0 ppid=1 pid=15161 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=78 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:14:06.613000 audit[15161]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff16c586d0 a2=3 a3=0 items=0 ppid=1 pid=15161 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=78 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:14:06.943774 kernel: audit: type=1327 audit(1707812046.613:2030): proctitle=737368643A20636F7265205B707269765D Feb 13 08:14:06.613000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:14:06.620000 audit[15161]: USER_START pid=15161 uid=0 auid=500 ses=78 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:07.068784 kernel: audit: type=1105 audit(1707812046.620:2031): pid=15161 uid=0 auid=500 ses=78 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:07.068825 kernel: audit: type=1103 audit(1707812046.621:2032): pid=15163 uid=0 auid=500 ses=78 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:06.621000 audit[15163]: CRED_ACQ pid=15163 uid=0 auid=500 ses=78 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:07.158069 kernel: audit: type=1106 audit(1707812046.698:2033): pid=15161 uid=0 auid=500 ses=78 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close 
grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:06.698000 audit[15161]: USER_END pid=15161 uid=0 auid=500 ses=78 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:06.698000 audit[15161]: CRED_DISP pid=15161 uid=0 auid=500 ses=78 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:07.254698 kernel: audit: type=1104 audit(1707812046.698:2034): pid=15161 uid=0 auid=500 ses=78 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:06.699000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@84-145.40.90.207:22-139.178.68.195:41674 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:14:11.711812 systemd[1]: Started sshd@85-145.40.90.207:22-139.178.68.195:41686.service. Feb 13 08:14:11.711000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@85-145.40.90.207:22-139.178.68.195:41686 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:14:11.738821 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:14:11.738932 kernel: audit: type=1130 audit(1707812051.711:2036): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@85-145.40.90.207:22-139.178.68.195:41686 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:14:11.855000 audit[15186]: USER_ACCT pid=15186 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:11.856561 sshd[15186]: Accepted publickey for core from 139.178.68.195 port 41686 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:14:11.857957 sshd[15186]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:14:11.860194 systemd-logind[1446]: New session 79 of user core. Feb 13 08:14:11.860655 systemd[1]: Started session-79.scope. Feb 13 08:14:11.941323 sshd[15186]: pam_unix(sshd:session): session closed for user core Feb 13 08:14:11.943064 systemd[1]: sshd@85-145.40.90.207:22-139.178.68.195:41686.service: Deactivated successfully. Feb 13 08:14:11.943596 systemd[1]: session-79.scope: Deactivated successfully. Feb 13 08:14:11.944100 systemd-logind[1446]: Session 79 logged out. Waiting for processes to exit. Feb 13 08:14:11.944763 systemd-logind[1446]: Removed session 79. 
Feb 13 08:14:11.857000 audit[15186]: CRED_ACQ pid=15186 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:12.040442 kernel: audit: type=1101 audit(1707812051.855:2037): pid=15186 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:12.040485 kernel: audit: type=1103 audit(1707812051.857:2038): pid=15186 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:12.040507 kernel: audit: type=1006 audit(1707812051.857:2039): pid=15186 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=79 res=1 Feb 13 08:14:12.099114 kernel: audit: type=1300 audit(1707812051.857:2039): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffce1a4af20 a2=3 a3=0 items=0 ppid=1 pid=15186 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=79 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:14:11.857000 audit[15186]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffce1a4af20 a2=3 a3=0 items=0 ppid=1 pid=15186 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=79 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:14:12.191166 kernel: audit: type=1327 audit(1707812051.857:2039): proctitle=737368643A20636F7265205B707269765D Feb 13 08:14:11.857000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:14:12.221643 kernel: audit: type=1105 audit(1707812051.862:2040): pid=15186 uid=0 auid=500 ses=79 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:11.862000 audit[15186]: USER_START pid=15186 uid=0 auid=500 ses=79 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:12.316143 kernel: audit: type=1103 audit(1707812051.863:2041): pid=15188 uid=0 auid=500 ses=79 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:11.863000 audit[15188]: CRED_ACQ pid=15188 uid=0 auid=500 ses=79 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:12.405410 kernel: audit: type=1106 audit(1707812051.941:2042): pid=15186 uid=0 auid=500 ses=79 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close 
grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:11.941000 audit[15186]: USER_END pid=15186 uid=0 auid=500 ses=79 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:12.500919 kernel: audit: type=1104 audit(1707812051.941:2043): pid=15186 uid=0 auid=500 ses=79 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:11.941000 audit[15186]: CRED_DISP pid=15186 uid=0 auid=500 ses=79 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:11.942000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@85-145.40.90.207:22-139.178.68.195:41686 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:14:12.706968 env[1458]: time="2024-02-13T08:14:12.706878532Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 08:14:12.734124 env[1458]: time="2024-02-13T08:14:12.734061976Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:14:12.734292 kubelet[2569]: E0213 08:14:12.734247 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 08:14:12.734292 kubelet[2569]: E0213 08:14:12.734273 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 08:14:12.734481 kubelet[2569]: E0213 08:14:12.734296 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:14:12.734481 kubelet[2569]: E0213 08:14:12.734315 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for 
\"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 08:14:14.707910 env[1458]: time="2024-02-13T08:14:14.707854000Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 08:14:14.707910 env[1458]: time="2024-02-13T08:14:14.707854745Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 08:14:14.730550 env[1458]: time="2024-02-13T08:14:14.730480114Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:14:14.730550 env[1458]: time="2024-02-13T08:14:14.730505772Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:14:14.730714 kubelet[2569]: E0213 08:14:14.730684 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 08:14:14.730714 kubelet[2569]: E0213 08:14:14.730714 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 08:14:14.730888 kubelet[2569]: E0213 08:14:14.730737 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:14:14.730888 kubelet[2569]: E0213 08:14:14.730684 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container 
is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 08:14:14.730888 kubelet[2569]: E0213 08:14:14.730755 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 08:14:14.730888 kubelet[2569]: E0213 08:14:14.730769 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 08:14:14.731007 kubelet[2569]: E0213 08:14:14.730790 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:14:14.731007 kubelet[2569]: E0213 08:14:14.730804 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 08:14:16.707010 env[1458]: time="2024-02-13T08:14:16.706890250Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 08:14:16.739392 env[1458]: time="2024-02-13T08:14:16.739353890Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:14:16.739564 kubelet[2569]: E0213 08:14:16.739552 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 08:14:16.739763 kubelet[2569]: E0213 08:14:16.739583 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd 
ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 08:14:16.739763 kubelet[2569]: E0213 08:14:16.739611 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:14:16.739763 kubelet[2569]: E0213 08:14:16.739641 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 08:14:16.953084 systemd[1]: Started sshd@86-145.40.90.207:22-139.178.68.195:49238.service. Feb 13 08:14:16.952000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@86-145.40.90.207:22-139.178.68.195:49238 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:14:16.993421 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:14:16.993501 kernel: audit: type=1130 audit(1707812056.952:2045): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@86-145.40.90.207:22-139.178.68.195:49238 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:14:17.109000 audit[15325]: USER_ACCT pid=15325 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:17.109968 sshd[15325]: Accepted publickey for core from 139.178.68.195 port 49238 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:14:17.111639 sshd[15325]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:14:17.114146 systemd-logind[1446]: New session 80 of user core. Feb 13 08:14:17.114640 systemd[1]: Started session-80.scope. 
Feb 13 08:14:17.110000 audit[15325]: CRED_ACQ pid=15325 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:17.291879 kernel: audit: type=1101 audit(1707812057.109:2046): pid=15325 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:17.291925 kernel: audit: type=1103 audit(1707812057.110:2047): pid=15325 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:17.291943 kernel: audit: type=1006 audit(1707812057.110:2048): pid=15325 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=80 res=1 Feb 13 08:14:17.350548 kernel: audit: type=1300 audit(1707812057.110:2048): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc0d049f30 a2=3 a3=0 items=0 ppid=1 pid=15325 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=80 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:14:17.110000 audit[15325]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc0d049f30 a2=3 a3=0 items=0 ppid=1 pid=15325 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=80 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:14:17.110000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:14:17.442897 sshd[15325]: pam_unix(sshd:session): session closed for user core Feb 13 08:14:17.444318 systemd[1]: sshd@86-145.40.90.207:22-139.178.68.195:49238.service: Deactivated successfully. Feb 13 08:14:17.444754 systemd[1]: session-80.scope: Deactivated successfully. Feb 13 08:14:17.445192 systemd-logind[1446]: Session 80 logged out. Waiting for processes to exit. Feb 13 08:14:17.445611 systemd-logind[1446]: Removed session 80. 
Feb 13 08:14:17.116000 audit[15325]: USER_START pid=15325 uid=0 auid=500 ses=80 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:17.567678 kernel: audit: type=1327 audit(1707812057.110:2048): proctitle=737368643A20636F7265205B707269765D Feb 13 08:14:17.567784 kernel: audit: type=1105 audit(1707812057.116:2049): pid=15325 uid=0 auid=500 ses=80 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:17.567811 kernel: audit: type=1103 audit(1707812057.117:2050): pid=15327 uid=0 auid=500 ses=80 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:17.117000 audit[15327]: CRED_ACQ pid=15327 uid=0 auid=500 ses=80 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:17.657023 kernel: audit: type=1106 audit(1707812057.442:2051): pid=15325 uid=0 auid=500 ses=80 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:17.442000 audit[15325]: USER_END pid=15325 uid=0 auid=500 ses=80 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:17.442000 audit[15325]: CRED_DISP pid=15325 uid=0 auid=500 ses=80 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:17.841939 kernel: audit: type=1104 audit(1707812057.442:2052): pid=15325 uid=0 auid=500 ses=80 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:17.443000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@86-145.40.90.207:22-139.178.68.195:49238 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:14:22.396029 systemd[1]: Started sshd@87-145.40.90.207:22-139.178.68.195:49254.service. Feb 13 08:14:22.395000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@87-145.40.90.207:22-139.178.68.195:49254 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:14:22.422909 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:14:22.423036 kernel: audit: type=1130 audit(1707812062.395:2054): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@87-145.40.90.207:22-139.178.68.195:49254 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:14:22.540000 audit[15350]: USER_ACCT pid=15350 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:22.541606 sshd[15350]: Accepted publickey for core from 139.178.68.195 port 49254 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:14:22.544992 sshd[15350]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:14:22.550016 systemd-logind[1446]: New session 81 of user core. Feb 13 08:14:22.550474 systemd[1]: Started session-81.scope. Feb 13 08:14:22.631947 sshd[15350]: pam_unix(sshd:session): session closed for user core Feb 13 08:14:22.633452 systemd[1]: sshd@87-145.40.90.207:22-139.178.68.195:49254.service: Deactivated successfully. Feb 13 08:14:22.633977 systemd[1]: session-81.scope: Deactivated successfully. Feb 13 08:14:22.634431 systemd-logind[1446]: Session 81 logged out. Waiting for processes to exit. Feb 13 08:14:22.543000 audit[15350]: CRED_ACQ pid=15350 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:22.634969 systemd-logind[1446]: Removed session 81. 
Feb 13 08:14:22.725575 kernel: audit: type=1101 audit(1707812062.540:2055): pid=15350 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:22.725612 kernel: audit: type=1103 audit(1707812062.543:2056): pid=15350 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:22.725630 kernel: audit: type=1006 audit(1707812062.543:2057): pid=15350 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=81 res=1 Feb 13 08:14:22.784290 kernel: audit: type=1300 audit(1707812062.543:2057): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffbffa4e30 a2=3 a3=0 items=0 ppid=1 pid=15350 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=81 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:14:22.543000 audit[15350]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffbffa4e30 a2=3 a3=0 items=0 ppid=1 pid=15350 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=81 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:14:22.876361 kernel: audit: type=1327 audit(1707812062.543:2057): proctitle=737368643A20636F7265205B707269765D Feb 13 08:14:22.543000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:14:22.906871 kernel: audit: type=1105 audit(1707812062.551:2058): pid=15350 uid=0 auid=500 ses=81 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:22.551000 audit[15350]: USER_START pid=15350 uid=0 auid=500 ses=81 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:23.001571 kernel: audit: type=1103 audit(1707812062.552:2059): pid=15352 uid=0 auid=500 ses=81 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:22.552000 audit[15352]: CRED_ACQ pid=15352 uid=0 auid=500 ses=81 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:23.090887 kernel: audit: type=1106 audit(1707812062.631:2060): pid=15350 uid=0 auid=500 ses=81 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:22.631000 audit[15350]: USER_END pid=15350 uid=0 auid=500 ses=81 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close 
grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:23.186516 kernel: audit: type=1104 audit(1707812062.631:2061): pid=15350 uid=0 auid=500 ses=81 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:22.631000 audit[15350]: CRED_DISP pid=15350 uid=0 auid=500 ses=81 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:22.632000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@87-145.40.90.207:22-139.178.68.195:49254 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:14:25.706910 env[1458]: time="2024-02-13T08:14:25.706811193Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 08:14:25.736188 env[1458]: time="2024-02-13T08:14:25.736118727Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:14:25.736363 kubelet[2569]: E0213 08:14:25.736350 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 08:14:25.736548 kubelet[2569]: E0213 08:14:25.736380 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 08:14:25.736548 kubelet[2569]: E0213 08:14:25.736412 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:14:25.736548 kubelet[2569]: E0213 08:14:25.736440 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and 
has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 08:14:26.706451 env[1458]: time="2024-02-13T08:14:26.706346278Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 08:14:26.735070 env[1458]: time="2024-02-13T08:14:26.735010543Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:14:26.735394 kubelet[2569]: E0213 08:14:26.735253 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 08:14:26.735394 kubelet[2569]: E0213 08:14:26.735285 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 08:14:26.735394 kubelet[2569]: E0213 08:14:26.735317 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:14:26.735394 kubelet[2569]: E0213 08:14:26.735342 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 08:14:27.641833 systemd[1]: Started sshd@88-145.40.90.207:22-139.178.68.195:37306.service. Feb 13 08:14:27.641000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@88-145.40.90.207:22-139.178.68.195:37306 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:14:27.668786 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:14:27.668835 kernel: audit: type=1130 audit(1707812067.641:2063): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@88-145.40.90.207:22-139.178.68.195:37306 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:14:27.786000 audit[15430]: USER_ACCT pid=15430 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:27.787511 sshd[15430]: Accepted publickey for core from 139.178.68.195 port 37306 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:14:27.788931 sshd[15430]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:14:27.791816 systemd-logind[1446]: New session 82 of user core. Feb 13 08:14:27.792954 systemd[1]: Started session-82.scope. Feb 13 08:14:27.872735 sshd[15430]: pam_unix(sshd:session): session closed for user core Feb 13 08:14:27.874546 systemd[1]: sshd@88-145.40.90.207:22-139.178.68.195:37306.service: Deactivated successfully. Feb 13 08:14:27.875326 systemd[1]: session-82.scope: Deactivated successfully. Feb 13 08:14:27.875967 systemd-logind[1446]: Session 82 logged out. Waiting for processes to exit. Feb 13 08:14:27.876493 systemd-logind[1446]: Removed session 82. Feb 13 08:14:27.788000 audit[15430]: CRED_ACQ pid=15430 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:27.880677 kernel: audit: type=1101 audit(1707812067.786:2064): pid=15430 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:27.880717 kernel: audit: type=1103 audit(1707812067.788:2065): pid=15430 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:27.881804 systemd[1]: Started sshd@89-145.40.90.207:22-139.178.68.195:37310.service. Feb 13 08:14:28.029603 kernel: audit: type=1006 audit(1707812067.788:2066): pid=15430 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=82 res=1 Feb 13 08:14:28.029648 kernel: audit: type=1300 audit(1707812067.788:2066): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc4a5f3320 a2=3 a3=0 items=0 ppid=1 pid=15430 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=82 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:14:27.788000 audit[15430]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc4a5f3320 a2=3 a3=0 items=0 ppid=1 pid=15430 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=82 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:14:28.057508 sshd[15455]: Accepted publickey for core from 139.178.68.195 port 37310 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:14:28.058833 sshd[15455]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:14:28.061298 systemd-logind[1446]: New session 83 of user core. Feb 13 08:14:28.061757 systemd[1]: Started session-83.scope. 
Feb 13 08:14:28.121714 kernel: audit: type=1327 audit(1707812067.788:2066): proctitle=737368643A20636F7265205B707269765D Feb 13 08:14:27.788000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:14:28.152202 kernel: audit: type=1105 audit(1707812067.794:2067): pid=15430 uid=0 auid=500 ses=82 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:27.794000 audit[15430]: USER_START pid=15430 uid=0 auid=500 ses=82 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:27.795000 audit[15432]: CRED_ACQ pid=15432 uid=0 auid=500 ses=82 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:28.336043 kernel: audit: type=1103 audit(1707812067.795:2068): pid=15432 uid=0 auid=500 ses=82 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:27.872000 audit[15430]: USER_END pid=15430 uid=0 auid=500 ses=82 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:28.431782 kernel: audit: type=1106 audit(1707812067.872:2069): pid=15430 uid=0 auid=500 ses=82 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:28.431819 kernel: audit: type=1104 audit(1707812067.872:2070): pid=15430 uid=0 auid=500 ses=82 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:27.872000 audit[15430]: CRED_DISP pid=15430 uid=0 auid=500 ses=82 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:27.874000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@88-145.40.90.207:22-139.178.68.195:37306 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:14:27.881000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@89-145.40.90.207:22-139.178.68.195:37310 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:14:28.056000 audit[15455]: USER_ACCT pid=15455 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:28.057000 audit[15455]: CRED_ACQ pid=15455 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:28.057000 audit[15455]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffdd37b8950 a2=3 a3=0 items=0 ppid=1 pid=15455 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=83 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:14:28.057000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:14:28.063000 audit[15455]: USER_START pid=15455 uid=0 auid=500 ses=83 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:28.063000 audit[15457]: CRED_ACQ pid=15457 uid=0 auid=500 ses=83 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:28.705432 env[1458]: time="2024-02-13T08:14:28.705358014Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 08:14:28.719797 env[1458]: time="2024-02-13T08:14:28.719741463Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:14:28.719931 kubelet[2569]: E0213 08:14:28.719918 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 08:14:28.720124 kubelet[2569]: E0213 08:14:28.719952 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 08:14:28.720124 kubelet[2569]: E0213 08:14:28.719985 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is 
running and has mounted /var/lib/calico/\"" Feb 13 08:14:28.720124 kubelet[2569]: E0213 08:14:28.720011 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 08:14:28.953266 sshd[15455]: pam_unix(sshd:session): session closed for user core Feb 13 08:14:28.955000 audit[15455]: USER_END pid=15455 uid=0 auid=500 ses=83 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:28.955000 audit[15455]: CRED_DISP pid=15455 uid=0 auid=500 ses=83 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:28.960551 systemd[1]: sshd@89-145.40.90.207:22-139.178.68.195:37310.service: Deactivated successfully. Feb 13 08:14:28.960000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@89-145.40.90.207:22-139.178.68.195:37310 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:14:28.961499 systemd[1]: session-83.scope: Deactivated successfully. Feb 13 08:14:28.962051 systemd-logind[1446]: Session 83 logged out. Waiting for processes to exit. Feb 13 08:14:28.962741 systemd[1]: Started sshd@90-145.40.90.207:22-139.178.68.195:37312.service. Feb 13 08:14:28.962000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@90-145.40.90.207:22-139.178.68.195:37312 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:14:28.963394 systemd-logind[1446]: Removed session 83. 
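The proctitle=737368643A20636F7265205B707269765D fields in the audit records above are hex-encoded: the kernel audit subsystem emits a process title as hex whenever it contains non-printable bytes, because argv strings are NUL-separated in memory. A minimal Python sketch of the decoding (illustrative only; the hex value below is copied verbatim from the records above, and the helper name is ours, not part of any tool in this log):

# Decode a hex-encoded proctitle field from a Linux audit record.
# The audit subsystem hex-encodes the value when it contains
# non-printable bytes; NUL bytes separate argv elements.
def decode_proctitle(hex_value: str) -> str:
    raw = bytes.fromhex(hex_value)
    # Replace argv NUL separators with spaces for display.
    return raw.replace(b"\x00", b" ").decode("utf-8", errors="replace")

# Value taken verbatim from the PROCTITLE records above:
print(decode_proctitle("737368643A20636F7265205B707269765D"))
# -> sshd: core [priv]

The same decoding applied to the iptables-restore PROCTITLE records seen later in this log yields the full command line with its NUL-separated flags (iptables-restore -w 5 -W 100000 --noflush --counters).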
Feb 13 08:14:28.996000 audit[15504]: USER_ACCT pid=15504 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:28.996965 sshd[15504]: Accepted publickey for core from 139.178.68.195 port 37312 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:14:28.996000 audit[15504]: CRED_ACQ pid=15504 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:28.996000 audit[15504]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc10bef530 a2=3 a3=0 items=0 ppid=1 pid=15504 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=84 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:14:28.996000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:14:28.997839 sshd[15504]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:14:29.001115 systemd-logind[1446]: New session 84 of user core. Feb 13 08:14:29.001973 systemd[1]: Started session-84.scope. Feb 13 08:14:29.005000 audit[15504]: USER_START pid=15504 uid=0 auid=500 ses=84 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:29.006000 audit[15508]: CRED_ACQ pid=15508 uid=0 auid=500 ses=84 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:29.620000 audit[15532]: NETFILTER_CFG table=filter:97 family=2 entries=26 op=nft_register_rule pid=15532 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 13 08:14:29.620000 audit[15532]: SYSCALL arch=c000003e syscall=46 success=yes exit=13404 a0=3 a1=7ffe8d0d8a70 a2=0 a3=7ffe8d0d8a5c items=0 ppid=2822 pid=15532 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:14:29.620000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 08:14:29.631683 sshd[15504]: pam_unix(sshd:session): session closed for user core Feb 13 08:14:29.631000 audit[15532]: NETFILTER_CFG table=nat:98 family=2 entries=14 op=nft_register_rule pid=15532 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 13 08:14:29.631000 audit[15532]: SYSCALL arch=c000003e syscall=46 success=yes exit=3300 a0=3 a1=7ffe8d0d8a70 a2=0 a3=31030 items=0 ppid=2822 pid=15532 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:14:29.631000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 08:14:29.631000 audit[15504]: USER_END pid=15504 uid=0 auid=500 ses=84 
subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:29.631000 audit[15504]: CRED_DISP pid=15504 uid=0 auid=500 ses=84 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:29.634430 systemd[1]: sshd@90-145.40.90.207:22-139.178.68.195:37312.service: Deactivated successfully. Feb 13 08:14:29.633000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@90-145.40.90.207:22-139.178.68.195:37312 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:14:29.634967 systemd[1]: session-84.scope: Deactivated successfully. Feb 13 08:14:29.635407 systemd-logind[1446]: Session 84 logged out. Waiting for processes to exit. Feb 13 08:14:29.636369 systemd[1]: Started sshd@91-145.40.90.207:22-139.178.68.195:37320.service. Feb 13 08:14:29.635000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@91-145.40.90.207:22-139.178.68.195:37320 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:14:29.636978 systemd-logind[1446]: Removed session 84. Feb 13 08:14:29.648000 audit[15538]: NETFILTER_CFG table=filter:99 family=2 entries=38 op=nft_register_rule pid=15538 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 13 08:14:29.648000 audit[15538]: SYSCALL arch=c000003e syscall=46 success=yes exit=13404 a0=3 a1=7ffc49193b90 a2=0 a3=7ffc49193b7c items=0 ppid=2822 pid=15538 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:14:29.648000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 08:14:29.649000 audit[15538]: NETFILTER_CFG table=nat:100 family=2 entries=14 op=nft_register_rule pid=15538 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 13 08:14:29.649000 audit[15538]: SYSCALL arch=c000003e syscall=46 success=yes exit=3300 a0=3 a1=7ffc49193b90 a2=0 a3=31030 items=0 ppid=2822 pid=15538 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:14:29.649000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 08:14:29.672000 audit[15535]: USER_ACCT pid=15535 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:29.673470 sshd[15535]: Accepted publickey for core from 139.178.68.195 port 37320 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:14:29.674000 audit[15535]: CRED_ACQ pid=15535 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred 
grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:29.675000 audit[15535]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd4191fe00 a2=3 a3=0 items=0 ppid=1 pid=15535 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=85 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:14:29.675000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:14:29.676472 sshd[15535]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:14:29.685825 systemd-logind[1446]: New session 85 of user core. Feb 13 08:14:29.688854 systemd[1]: Started session-85.scope. Feb 13 08:14:29.701000 audit[15535]: USER_START pid=15535 uid=0 auid=500 ses=85 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:29.704000 audit[15539]: CRED_ACQ pid=15539 uid=0 auid=500 ses=85 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:29.969589 sshd[15535]: pam_unix(sshd:session): session closed for user core Feb 13 08:14:29.971000 audit[15535]: USER_END pid=15535 uid=0 auid=500 ses=85 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:29.971000 audit[15535]: CRED_DISP pid=15535 uid=0 auid=500 ses=85 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:29.976670 systemd[1]: sshd@91-145.40.90.207:22-139.178.68.195:37320.service: Deactivated successfully. Feb 13 08:14:29.976000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@91-145.40.90.207:22-139.178.68.195:37320 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:14:29.978406 systemd[1]: session-85.scope: Deactivated successfully. Feb 13 08:14:29.980201 systemd-logind[1446]: Session 85 logged out. Waiting for processes to exit. Feb 13 08:14:29.983220 systemd[1]: Started sshd@92-145.40.90.207:22-139.178.68.195:37336.service. Feb 13 08:14:29.983000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@92-145.40.90.207:22-139.178.68.195:37336 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:14:29.985660 systemd-logind[1446]: Removed session 85. 
Feb 13 08:14:30.051000 audit[15561]: USER_ACCT pid=15561 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:30.052126 sshd[15561]: Accepted publickey for core from 139.178.68.195 port 37336 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:14:30.053000 audit[15561]: CRED_ACQ pid=15561 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:30.053000 audit[15561]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffa51297f0 a2=3 a3=0 items=0 ppid=1 pid=15561 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=86 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:14:30.053000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:14:30.054852 sshd[15561]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:14:30.064675 systemd-logind[1446]: New session 86 of user core. Feb 13 08:14:30.067622 systemd[1]: Started session-86.scope. Feb 13 08:14:30.081000 audit[15561]: USER_START pid=15561 uid=0 auid=500 ses=86 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:30.084000 audit[15564]: CRED_ACQ pid=15564 uid=0 auid=500 ses=86 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:30.210213 sshd[15561]: pam_unix(sshd:session): session closed for user core Feb 13 08:14:30.210000 audit[15561]: USER_END pid=15561 uid=0 auid=500 ses=86 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:30.210000 audit[15561]: CRED_DISP pid=15561 uid=0 auid=500 ses=86 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:30.211676 systemd[1]: sshd@92-145.40.90.207:22-139.178.68.195:37336.service: Deactivated successfully. Feb 13 08:14:30.211000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@92-145.40.90.207:22-139.178.68.195:37336 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:14:30.212103 systemd[1]: session-86.scope: Deactivated successfully. Feb 13 08:14:30.212496 systemd-logind[1446]: Session 86 logged out. Waiting for processes to exit. Feb 13 08:14:30.213143 systemd-logind[1446]: Removed session 86. 
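Every sandbox teardown in this log fails identically: the calico CNI plugin cannot stat /var/lib/calico/nodename, the file calico/node writes at startup, and the error text itself names the check to perform ("check that the calico/node container is running and has mounted /var/lib/calico/"). A minimal Python sketch of that on-node check, using only the path quoted in the error (illustrative, not part of any tool shown in this log):

# Reproduce the check the calico CNI plugin performs during sandbox
# teardown: /var/lib/calico/nodename must exist. calico/node writes
# this file when it starts with /var/lib/calico/ mounted into it.
NODENAME_FILE = "/var/lib/calico/nodename"  # path quoted in the log's error

try:
    with open(NODENAME_FILE) as f:
        print(f"calico nodename present: {f.read().strip()!r}")
except FileNotFoundError:
    # Exactly the state this log shows, repeated on every teardown:
    # "stat /var/lib/calico/nodename: no such file or directory"
    print("nodename file missing: calico/node is not running on this "
          "host, or it does not have /var/lib/calico/ mounted")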
Feb 13 08:14:30.706440 env[1458]: time="2024-02-13T08:14:30.706343798Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 08:14:30.733747 env[1458]: time="2024-02-13T08:14:30.733710143Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:14:30.733977 kubelet[2569]: E0213 08:14:30.733937 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 08:14:30.733977 kubelet[2569]: E0213 08:14:30.733962 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 08:14:30.734156 kubelet[2569]: E0213 08:14:30.733984 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:14:30.734156 kubelet[2569]: E0213 08:14:30.734003 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 08:14:35.214209 systemd[1]: Started sshd@93-145.40.90.207:22-139.178.68.195:37340.service. Feb 13 08:14:35.213000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@93-145.40.90.207:22-139.178.68.195:37340 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:14:35.241522 kernel: kauditd_printk_skb: 57 callbacks suppressed Feb 13 08:14:35.241561 kernel: audit: type=1130 audit(1707812075.213:2112): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@93-145.40.90.207:22-139.178.68.195:37340 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:14:35.359000 audit[15617]: USER_ACCT pid=15617 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:35.360827 sshd[15617]: Accepted publickey for core from 139.178.68.195 port 37340 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:14:35.364312 sshd[15617]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:14:35.374374 systemd-logind[1446]: New session 87 of user core. Feb 13 08:14:35.377138 systemd[1]: Started session-87.scope. Feb 13 08:14:35.362000 audit[15617]: CRED_ACQ pid=15617 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:35.473949 sshd[15617]: pam_unix(sshd:session): session closed for user core Feb 13 08:14:35.475370 systemd[1]: sshd@93-145.40.90.207:22-139.178.68.195:37340.service: Deactivated successfully. Feb 13 08:14:35.475799 systemd[1]: session-87.scope: Deactivated successfully. Feb 13 08:14:35.476214 systemd-logind[1446]: Session 87 logged out. Waiting for processes to exit. Feb 13 08:14:35.476629 systemd-logind[1446]: Removed session 87. Feb 13 08:14:35.542883 kernel: audit: type=1101 audit(1707812075.359:2113): pid=15617 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:35.542923 kernel: audit: type=1103 audit(1707812075.362:2114): pid=15617 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:35.542939 kernel: audit: type=1006 audit(1707812075.362:2115): pid=15617 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=87 res=1 Feb 13 08:14:35.601578 kernel: audit: type=1300 audit(1707812075.362:2115): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe6d000690 a2=3 a3=0 items=0 ppid=1 pid=15617 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=87 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:14:35.362000 audit[15617]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe6d000690 a2=3 a3=0 items=0 ppid=1 pid=15617 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=87 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:14:35.693668 kernel: audit: type=1327 audit(1707812075.362:2115): proctitle=737368643A20636F7265205B707269765D Feb 13 08:14:35.362000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:14:35.724151 kernel: audit: type=1105 audit(1707812075.385:2116): pid=15617 uid=0 auid=500 ses=87 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 
08:14:35.385000 audit[15617]: USER_START pid=15617 uid=0 auid=500 ses=87 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:35.386000 audit[15619]: CRED_ACQ pid=15619 uid=0 auid=500 ses=87 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:35.908005 kernel: audit: type=1103 audit(1707812075.386:2117): pid=15619 uid=0 auid=500 ses=87 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:35.908040 kernel: audit: type=1106 audit(1707812075.473:2118): pid=15617 uid=0 auid=500 ses=87 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:35.473000 audit[15617]: USER_END pid=15617 uid=0 auid=500 ses=87 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:36.003635 kernel: audit: type=1104 audit(1707812075.474:2119): pid=15617 uid=0 auid=500 ses=87 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:35.474000 audit[15617]: CRED_DISP pid=15617 uid=0 auid=500 ses=87 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:35.474000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@93-145.40.90.207:22-139.178.68.195:37340 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:14:37.707756 env[1458]: time="2024-02-13T08:14:37.707601980Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 08:14:37.736967 env[1458]: time="2024-02-13T08:14:37.736888518Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:14:37.737165 kubelet[2569]: E0213 08:14:37.737136 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 08:14:37.737339 kubelet[2569]: E0213 08:14:37.737176 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 08:14:37.737339 kubelet[2569]: E0213 08:14:37.737199 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:14:37.737339 kubelet[2569]: E0213 08:14:37.737215 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 08:14:40.483662 systemd[1]: Started sshd@94-145.40.90.207:22-139.178.68.195:53344.service. Feb 13 08:14:40.483000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@94-145.40.90.207:22-139.178.68.195:53344 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:14:40.510779 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:14:40.510843 kernel: audit: type=1130 audit(1707812080.483:2121): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@94-145.40.90.207:22-139.178.68.195:53344 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:14:40.627000 audit[15669]: USER_ACCT pid=15669 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:40.627948 sshd[15669]: Accepted publickey for core from 139.178.68.195 port 53344 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:14:40.629908 sshd[15669]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:14:40.632294 systemd-logind[1446]: New session 88 of user core. Feb 13 08:14:40.632809 systemd[1]: Started session-88.scope. Feb 13 08:14:40.706006 env[1458]: time="2024-02-13T08:14:40.705979166Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 08:14:40.712919 sshd[15669]: pam_unix(sshd:session): session closed for user core Feb 13 08:14:40.714459 systemd[1]: sshd@94-145.40.90.207:22-139.178.68.195:53344.service: Deactivated successfully. Feb 13 08:14:40.714926 systemd[1]: session-88.scope: Deactivated successfully. Feb 13 08:14:40.715328 systemd-logind[1446]: Session 88 logged out. Waiting for processes to exit. Feb 13 08:14:40.715906 systemd-logind[1446]: Removed session 88. Feb 13 08:14:40.719409 env[1458]: time="2024-02-13T08:14:40.719380913Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:14:40.719531 kubelet[2569]: E0213 08:14:40.719520 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 08:14:40.719724 kubelet[2569]: E0213 08:14:40.719549 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 08:14:40.719724 kubelet[2569]: E0213 08:14:40.719577 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:14:40.719724 kubelet[2569]: E0213 08:14:40.719600 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed 
(delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 08:14:40.629000 audit[15669]: CRED_ACQ pid=15669 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:40.812991 kernel: audit: type=1101 audit(1707812080.627:2122): pid=15669 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:40.813034 kernel: audit: type=1103 audit(1707812080.629:2123): pid=15669 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:40.813052 kernel: audit: type=1006 audit(1707812080.629:2124): pid=15669 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=88 res=1 Feb 13 08:14:40.871688 kernel: audit: type=1300 audit(1707812080.629:2124): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff70f73210 a2=3 a3=0 items=0 ppid=1 pid=15669 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=88 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:14:40.629000 audit[15669]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff70f73210 a2=3 a3=0 items=0 ppid=1 pid=15669 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=88 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:14:40.963797 kernel: audit: type=1327 audit(1707812080.629:2124): proctitle=737368643A20636F7265205B707269765D Feb 13 08:14:40.629000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:14:40.994277 kernel: audit: type=1105 audit(1707812080.634:2125): pid=15669 uid=0 auid=500 ses=88 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:40.634000 audit[15669]: USER_START pid=15669 uid=0 auid=500 ses=88 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:41.088848 kernel: audit: type=1103 audit(1707812080.634:2126): pid=15671 uid=0 auid=500 ses=88 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:40.634000 audit[15671]: CRED_ACQ pid=15671 uid=0 auid=500 ses=88 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:41.178139 
kernel: audit: type=1106 audit(1707812080.712:2127): pid=15669 uid=0 auid=500 ses=88 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:40.712000 audit[15669]: USER_END pid=15669 uid=0 auid=500 ses=88 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:41.273812 kernel: audit: type=1104 audit(1707812080.713:2128): pid=15669 uid=0 auid=500 ses=88 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:40.713000 audit[15669]: CRED_DISP pid=15669 uid=0 auid=500 ses=88 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:40.713000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@94-145.40.90.207:22-139.178.68.195:53344 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:14:41.706838 env[1458]: time="2024-02-13T08:14:41.706716584Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 08:14:41.706838 env[1458]: time="2024-02-13T08:14:41.706722088Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 08:14:41.733399 env[1458]: time="2024-02-13T08:14:41.733315909Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:14:41.733399 env[1458]: time="2024-02-13T08:14:41.733369478Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:14:41.733563 kubelet[2569]: E0213 08:14:41.733532 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 08:14:41.733563 kubelet[2569]: E0213 08:14:41.733558 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd 
ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 08:14:41.733756 kubelet[2569]: E0213 08:14:41.733584 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:14:41.733756 kubelet[2569]: E0213 08:14:41.733532 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 08:14:41.733756 kubelet[2569]: E0213 08:14:41.733603 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 08:14:41.733756 kubelet[2569]: E0213 08:14:41.733613 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 08:14:41.733872 kubelet[2569]: E0213 08:14:41.733636 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:14:41.733872 kubelet[2569]: E0213 08:14:41.733651 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 08:14:45.724574 systemd[1]: Started sshd@95-145.40.90.207:22-139.178.68.195:53348.service. 
Feb 13 08:14:45.724000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@95-145.40.90.207:22-139.178.68.195:53348 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:14:45.767462 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:14:45.767564 kernel: audit: type=1130 audit(1707812085.724:2130): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@95-145.40.90.207:22-139.178.68.195:53348 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:14:45.884000 audit[15779]: USER_ACCT pid=15779 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:45.885580 sshd[15779]: Accepted publickey for core from 139.178.68.195 port 53348 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:14:45.886851 sshd[15779]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:14:45.889329 systemd-logind[1446]: New session 89 of user core. Feb 13 08:14:45.889915 systemd[1]: Started session-89.scope. Feb 13 08:14:45.969251 sshd[15779]: pam_unix(sshd:session): session closed for user core Feb 13 08:14:45.970620 systemd[1]: sshd@95-145.40.90.207:22-139.178.68.195:53348.service: Deactivated successfully. Feb 13 08:14:45.971175 systemd[1]: session-89.scope: Deactivated successfully. Feb 13 08:14:45.971485 systemd-logind[1446]: Session 89 logged out. Waiting for processes to exit. Feb 13 08:14:45.971917 systemd-logind[1446]: Removed session 89. 
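The kernel: audit: type=NNNN lines interleaved throughout are kauditd reprinting records through the kernel ring buffer, which is why the same content appears twice; records belonging to one event, such as the SYSCALL/PROCTITLE pairs above, share the serial in the audit(<epoch>.<msec>:<serial>) stamp. A small sketch, assuming that standard stamp format, for grouping lines by event serial (the sample lines are abbreviated from this log):

# Group audit log lines by their event serial so kauditd's kernel-ring
# copies line up with the records they duplicate. Assumes the standard
# audit(<epoch>.<msec>:<serial>) stamp.
import re
from collections import defaultdict

STAMP = re.compile(r"audit\((\d+)\.(\d+):(\d+)\)")

def group_by_serial(lines):
    events = defaultdict(list)
    for line in lines:
        m = STAMP.search(line)
        if m:
            events[int(m.group(3))].append(line)
    return events

sample = [
    "kernel: audit: type=1300 audit(1707812067.788:2066): arch=c000003e syscall=1 ...",
    "kernel: audit: type=1327 audit(1707812067.788:2066): proctitle=737368643A20636F7265205B707269765D",
]
for serial, entries in group_by_serial(sample).items():
    print(serial, "->", len(entries), "record(s)")  # 2066 -> 2 record(s)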
Feb 13 08:14:45.886000 audit[15779]: CRED_ACQ pid=15779 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:46.068974 kernel: audit: type=1101 audit(1707812085.884:2131): pid=15779 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:46.069063 kernel: audit: type=1103 audit(1707812085.886:2132): pid=15779 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:46.069087 kernel: audit: type=1006 audit(1707812085.886:2133): pid=15779 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=89 res=1 Feb 13 08:14:45.886000 audit[15779]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe3968d0a0 a2=3 a3=0 items=0 ppid=1 pid=15779 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=89 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:14:46.127715 kernel: audit: type=1300 audit(1707812085.886:2133): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe3968d0a0 a2=3 a3=0 items=0 ppid=1 pid=15779 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=89 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:14:45.886000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:14:46.250135 kernel: audit: type=1327 audit(1707812085.886:2133): proctitle=737368643A20636F7265205B707269765D Feb 13 08:14:46.250166 kernel: audit: type=1105 audit(1707812085.891:2134): pid=15779 uid=0 auid=500 ses=89 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:45.891000 audit[15779]: USER_START pid=15779 uid=0 auid=500 ses=89 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:45.891000 audit[15781]: CRED_ACQ pid=15781 uid=0 auid=500 ses=89 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:46.434042 kernel: audit: type=1103 audit(1707812085.891:2135): pid=15781 uid=0 auid=500 ses=89 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:46.434083 kernel: audit: type=1106 audit(1707812085.969:2136): pid=15779 uid=0 auid=500 ses=89 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close 
grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:45.969000 audit[15779]: USER_END pid=15779 uid=0 auid=500 ses=89 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:45.969000 audit[15779]: CRED_DISP pid=15779 uid=0 auid=500 ses=89 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:46.618894 kernel: audit: type=1104 audit(1707812085.969:2137): pid=15779 uid=0 auid=500 ses=89 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:45.970000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@95-145.40.90.207:22-139.178.68.195:53348 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:14:46.123000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:14:46.123000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c00114ece0 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:14:46.123000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:14:46.123000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:14:46.123000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c000f4ea50 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:14:46.123000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:14:46.192000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 
tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:14:46.192000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5c a1=c00e502d20 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:14:46.192000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:14:46.192000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/apiserver.crt" dev="sdb9" ino=524817 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:14:46.192000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5c a1=c008db69c0 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:14:46.192000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:14:46.192000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:14:46.192000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=60 a1=c00e84a040 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:14:46.192000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:14:46.945000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:14:46.945000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:14:46.945000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5d a1=c00863c800 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) 
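
The proctitle= values in the audit records above are hex-encoded process titles: the kernel captures argv with NUL bytes between arguments and truncates long command lines, which is why the kube-controller-manager and kube-apiserver titles break off mid-flag. A minimal decode sketch (Python; the hex string is copied verbatim from the sshd PROCTITLE record above):

```python
# Decode an audit PROCTITLE field: hex-encoded argv, NUL-separated.
def decode_proctitle(hex_str: str) -> list[str]:
    raw = bytes.fromhex(hex_str)
    return [arg.decode("utf-8", errors="replace") for arg in raw.split(b"\x00")]

# Value from the sshd record above:
print(decode_proctitle("737368643A20636F7265205B707269765D"))
# -> ['sshd: core [priv]']
```

Applied to the kube-controller-manager records, the same decode yields the NUL-separated flags (--allocate-node-cidrs=true, --authentication-kubeconfig=/etc/kubernetes/controller-manager.conf, ...) up to the point where the logged title is cut off.
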
Feb 13 08:14:46.945000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5c a1=c00a76dc50 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:14:46.945000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:14:46.945000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:14:46.945000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-client.crt" dev="sdb9" ino=524823 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:14:46.945000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=62 a1=c00806abd0 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:14:46.945000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:14:50.878000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:14:50.922285 kernel: kauditd_printk_skb: 25 callbacks suppressed Feb 13 08:14:50.922399 kernel: audit: type=1400 audit(1707812090.878:2147): avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:14:50.972532 systemd[1]: Started sshd@96-145.40.90.207:22-139.178.68.195:38394.service. 
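
The recurring AVC records show SELinux, enforcing (permissive=0), denying the kube-apiserver and kube-controller-manager containers, confined as svirt_lxc_net_t, the watch permission on host certificate files labeled etc_t. The paired SYSCALL records spell out the mechanism: arch=c000003e is x86_64, where syscall 254 is inotify_add_watch, and exit=-13 is -EACCES, so the watch attempts on the PKI files are refused and the processes are left to re-read the certificates instead. A small sketch for reading those numbers out of such a record (Python; the strings and constants are abridged from the records above):

```python
import errno
import re

# Abridged from one of the paired AVC/SYSCALL records above.
avc = ('avc: denied { watch } for pid=2387 comm="kube-controller" '
       'path="/etc/kubernetes/pki/ca.crt" tclass=file permissive=0')

perm = re.search(r"denied\s+\{ (\w+) \}", avc).group(1)
print(perm)                     # watch
# exit=-13 in the SYSCALL record is a negated errno:
print(errno.errorcode[13])      # EACCES -- the inotify_add_watch was refused
```
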
Feb 13 08:14:50.878000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c00114f480 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:14:51.129951 kernel: audit: type=1300 audit(1707812090.878:2147): arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c00114f480 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:14:51.129985 kernel: audit: type=1327 audit(1707812090.878:2147): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:14:50.878000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:14:51.157687 sshd[15804]: Accepted publickey for core from 139.178.68.195 port 38394 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:14:51.158941 sshd[15804]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:14:51.161269 systemd-logind[1446]: New session 90 of user core. Feb 13 08:14:51.161733 systemd[1]: Started session-90.scope. Feb 13 08:14:51.222477 kernel: audit: type=1400 audit(1707812090.878:2148): avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:14:50.878000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:14:51.240993 sshd[15804]: pam_unix(sshd:session): session closed for user core Feb 13 08:14:51.242331 systemd[1]: sshd@96-145.40.90.207:22-139.178.68.195:38394.service: Deactivated successfully. Feb 13 08:14:51.242765 systemd[1]: session-90.scope: Deactivated successfully. Feb 13 08:14:51.243119 systemd-logind[1446]: Session 90 logged out. Waiting for processes to exit. Feb 13 08:14:51.243546 systemd-logind[1446]: Removed session 90. 
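
Note that each PAM event in this stretch is recorded twice: once as a journald audit record (USER_ACCT, CRED_ACQ, USER_START, USER_END, CRED_DISP) and once echoed through the kernel ring buffer as kernel: audit: type=NNNN carrying the same audit(timestamp:serial) id, so the numeric types can be read directly off the pairs above: 1101=USER_ACCT, 1103=CRED_ACQ, 1104=CRED_DISP, 1105=USER_START, 1106=USER_END, 1300=SYSCALL, 1327=PROCTITLE, 1400=AVC. A minimal field parser for triaging a dump like this one (a sketch; parse_audit is a hypothetical helper, not part of any audit tooling):

```python
import shlex

# Numeric audit types, read off the paired records above.
AUDIT_TYPES = {1101: "USER_ACCT", 1103: "CRED_ACQ", 1104: "CRED_DISP",
               1105: "USER_START", 1106: "USER_END",
               1300: "SYSCALL", 1327: "PROCTITLE", 1400: "AVC"}

def parse_audit(body: str) -> dict[str, str]:
    """Split an audit record body into key=value fields.

    shlex keeps the single-quoted msg='...' payload together as one field.
    """
    fields = {}
    for tok in shlex.split(body):
        if "=" in tok:
            key, value = tok.split("=", 1)
            fields[key] = value
    return fields

rec = parse_audit("pid=15804 uid=0 auid=500 ses=90 terminal=ssh res=success")
print(rec["ses"], rec["res"])   # 90 success
```
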
Feb 13 08:14:51.311868 kernel: audit: type=1300 audit(1707812090.878:2148): arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c001000620 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:14:50.878000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c001000620 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:14:51.431261 kernel: audit: type=1327 audit(1707812090.878:2148): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:14:50.878000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:14:51.524150 kernel: audit: type=1400 audit(1707812090.878:2149): avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:14:50.878000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:14:51.614254 kernel: audit: type=1300 audit(1707812090.878:2149): arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c0029ec840 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:14:50.878000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c0029ec840 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:14:51.705146 env[1458]: time="2024-02-13T08:14:51.705086431Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 08:14:51.716445 env[1458]: time="2024-02-13T08:14:51.716386434Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:14:51.716589 kubelet[2569]: E0213 08:14:51.716577 2569 remote_runtime.go:205] 
"StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 08:14:51.716784 kubelet[2569]: E0213 08:14:51.716607 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 08:14:51.716784 kubelet[2569]: E0213 08:14:51.716644 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:14:51.716784 kubelet[2569]: E0213 08:14:51.716670 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 08:14:51.734459 kernel: audit: type=1327 audit(1707812090.878:2149): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:14:50.878000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:14:51.827433 kernel: audit: type=1400 audit(1707812090.883:2150): avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:14:50.883000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:14:50.883000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c001000760 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:14:50.883000 audit: PROCTITLE 
proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:14:50.971000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@96-145.40.90.207:22-139.178.68.195:38394 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:14:51.156000 audit[15804]: USER_ACCT pid=15804 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:51.158000 audit[15804]: CRED_ACQ pid=15804 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:51.158000 audit[15804]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd9dea1730 a2=3 a3=0 items=0 ppid=1 pid=15804 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=90 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:14:51.158000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:14:51.163000 audit[15804]: USER_START pid=15804 uid=0 auid=500 ses=90 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:51.164000 audit[15806]: CRED_ACQ pid=15806 uid=0 auid=500 ses=90 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:51.240000 audit[15804]: USER_END pid=15804 uid=0 auid=500 ses=90 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:51.241000 audit[15804]: CRED_DISP pid=15804 uid=0 auid=500 ses=90 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:51.241000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@96-145.40.90.207:22-139.178.68.195:38394 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:14:52.706232 env[1458]: time="2024-02-13T08:14:52.706143503Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 08:14:52.731294 env[1458]: time="2024-02-13T08:14:52.731260205Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:14:52.731512 kubelet[2569]: E0213 08:14:52.731501 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 08:14:52.731703 kubelet[2569]: E0213 08:14:52.731527 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 08:14:52.731703 kubelet[2569]: E0213 08:14:52.731549 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:14:52.731703 kubelet[2569]: E0213 08:14:52.731566 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 08:14:53.706130 env[1458]: time="2024-02-13T08:14:53.706046506Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 08:14:53.721679 env[1458]: time="2024-02-13T08:14:53.721590872Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:14:53.721901 kubelet[2569]: E0213 08:14:53.721782 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox 
\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 08:14:53.721901 kubelet[2569]: E0213 08:14:53.721810 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 08:14:53.721901 kubelet[2569]: E0213 08:14:53.721834 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:14:53.721901 kubelet[2569]: E0213 08:14:53.721853 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 08:14:54.140083 systemd[1]: Started sshd@97-145.40.90.207:22-43.153.220.201:55114.service. Feb 13 08:14:54.139000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@97-145.40.90.207:22-43.153.220.201:55114 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:14:54.706476 env[1458]: time="2024-02-13T08:14:54.706334166Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 08:14:54.736169 env[1458]: time="2024-02-13T08:14:54.736063670Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:14:54.736443 kubelet[2569]: E0213 08:14:54.736313 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 08:14:54.736443 kubelet[2569]: E0213 08:14:54.736351 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 08:14:54.736443 kubelet[2569]: E0213 08:14:54.736374 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:14:54.736443 kubelet[2569]: E0213 08:14:54.736390 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 08:14:55.211352 sshd[15914]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=43.153.220.201 user=root Feb 13 08:14:55.210000 audit[15914]: USER_AUTH pid=15914 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? acct="root" exe="/usr/sbin/sshd" hostname=43.153.220.201 addr=43.153.220.201 terminal=ssh res=failed' Feb 13 08:14:56.244519 systemd[1]: Started sshd@98-145.40.90.207:22-139.178.68.195:59346.service. Feb 13 08:14:56.244000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@98-145.40.90.207:22-139.178.68.195:59346 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:14:56.271890 kernel: kauditd_printk_skb: 15 callbacks suppressed Feb 13 08:14:56.271955 kernel: audit: type=1130 audit(1707812096.244:2162): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@98-145.40.90.207:22-139.178.68.195:59346 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:14:56.390000 audit[15947]: USER_ACCT pid=15947 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:56.391559 sshd[15947]: Accepted publickey for core from 139.178.68.195 port 59346 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:14:56.394901 sshd[15947]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:14:56.402372 systemd-logind[1446]: New session 91 of user core. Feb 13 08:14:56.404302 systemd[1]: Started session-91.scope. Feb 13 08:14:56.393000 audit[15947]: CRED_ACQ pid=15947 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:56.491561 sshd[15947]: pam_unix(sshd:session): session closed for user core Feb 13 08:14:56.492928 systemd[1]: sshd@98-145.40.90.207:22-139.178.68.195:59346.service: Deactivated successfully. Feb 13 08:14:56.493358 systemd[1]: session-91.scope: Deactivated successfully. Feb 13 08:14:56.493635 systemd-logind[1446]: Session 91 logged out. Waiting for processes to exit. Feb 13 08:14:56.494194 systemd-logind[1446]: Removed session 91. 
Feb 13 08:14:56.574433 kernel: audit: type=1101 audit(1707812096.390:2163): pid=15947 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:56.574471 kernel: audit: type=1103 audit(1707812096.393:2164): pid=15947 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:56.574490 kernel: audit: type=1006 audit(1707812096.393:2165): pid=15947 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=91 res=1 Feb 13 08:14:56.633169 kernel: audit: type=1300 audit(1707812096.393:2165): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffed9e3ec50 a2=3 a3=0 items=0 ppid=1 pid=15947 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=91 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:14:56.393000 audit[15947]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffed9e3ec50 a2=3 a3=0 items=0 ppid=1 pid=15947 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=91 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:14:56.725892 kernel: audit: type=1327 audit(1707812096.393:2165): proctitle=737368643A20636F7265205B707269765D Feb 13 08:14:56.393000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:14:56.756714 kernel: audit: type=1105 audit(1707812096.412:2166): pid=15947 uid=0 auid=500 ses=91 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:56.412000 audit[15947]: USER_START pid=15947 uid=0 auid=500 ses=91 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:56.852348 kernel: audit: type=1103 audit(1707812096.414:2167): pid=15949 uid=0 auid=500 ses=91 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:56.414000 audit[15949]: CRED_ACQ pid=15949 uid=0 auid=500 ses=91 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:56.941766 kernel: audit: type=1106 audit(1707812096.491:2168): pid=15947 uid=0 auid=500 ses=91 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:56.491000 audit[15947]: USER_END pid=15947 uid=0 auid=500 ses=91 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close 
grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:57.037649 kernel: audit: type=1104 audit(1707812096.491:2169): pid=15947 uid=0 auid=500 ses=91 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:56.491000 audit[15947]: CRED_DISP pid=15947 uid=0 auid=500 ses=91 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:14:56.492000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@98-145.40.90.207:22-139.178.68.195:59346 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:14:57.784883 sshd[15914]: Failed password for root from 43.153.220.201 port 55114 ssh2 Feb 13 08:14:58.843912 sshd[15914]: Received disconnect from 43.153.220.201 port 55114:11: Bye Bye [preauth] Feb 13 08:14:58.843912 sshd[15914]: Disconnected from authenticating user root 43.153.220.201 port 55114 [preauth] Feb 13 08:14:58.846429 systemd[1]: sshd@97-145.40.90.207:22-43.153.220.201:55114.service: Deactivated successfully. Feb 13 08:14:58.846000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@97-145.40.90.207:22-43.153.220.201:55114 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:15:01.500575 systemd[1]: Started sshd@99-145.40.90.207:22-139.178.68.195:59348.service. Feb 13 08:15:01.500000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@99-145.40.90.207:22-139.178.68.195:59348 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:15:01.527701 kernel: kauditd_printk_skb: 2 callbacks suppressed Feb 13 08:15:01.527771 kernel: audit: type=1130 audit(1707812101.500:2172): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@99-145.40.90.207:22-139.178.68.195:59348 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:15:01.644000 audit[15974]: USER_ACCT pid=15974 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:01.645760 sshd[15974]: Accepted publickey for core from 139.178.68.195 port 59348 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:15:01.647938 sshd[15974]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:15:01.650313 systemd-logind[1446]: New session 92 of user core. Feb 13 08:15:01.650864 systemd[1]: Started session-92.scope. Feb 13 08:15:01.731757 sshd[15974]: pam_unix(sshd:session): session closed for user core Feb 13 08:15:01.733570 systemd[1]: sshd@99-145.40.90.207:22-139.178.68.195:59348.service: Deactivated successfully. Feb 13 08:15:01.734365 systemd[1]: session-92.scope: Deactivated successfully. 
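
The StopPodSandbox errors repeating through this stretch are a single failure being retried: the Calico CNI plugin cannot stat /var/lib/calico/nodename during sandbox teardown, the CNI delete fails, and the kubelet re-queues the same pods (calico-kube-controllers-846b88998b-4vbpv, csi-node-driver-8djc9, and both coredns replicas) a few seconds apart. The error text names the precondition itself; below is a sketch of the check that is failing, assuming the standard setup in which calico/node writes the file via a hostPath mount of /var/lib/calico:

```python
# /var/lib/calico/nodename is normally written by the calico/node container at
# startup; the CNI plugin reads it on every sandbox add/delete. If calico/node
# is not running, or /var/lib/calico/ is not mounted into it, teardown fails
# exactly as logged above.
NODENAME = "/var/lib/calico/nodename"
try:
    with open(NODENAME) as f:
        print("calico node name:", f.read().strip())
except FileNotFoundError:
    # This is the state producing the retry loop in the log.
    print(f"stat {NODENAME}: no such file or directory; "
          "check that the calico/node container is running "
          "and has mounted /var/lib/calico/")
```
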
Feb 13 08:15:01.735034 systemd-logind[1446]: Session 92 logged out. Waiting for processes to exit. Feb 13 08:15:01.735575 systemd-logind[1446]: Removed session 92. Feb 13 08:15:01.647000 audit[15974]: CRED_ACQ pid=15974 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:01.829894 kernel: audit: type=1101 audit(1707812101.644:2173): pid=15974 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:01.829930 kernel: audit: type=1103 audit(1707812101.647:2174): pid=15974 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:01.829948 kernel: audit: type=1006 audit(1707812101.647:2175): pid=15974 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=92 res=1 Feb 13 08:15:01.888452 kernel: audit: type=1300 audit(1707812101.647:2175): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff4536e4c0 a2=3 a3=0 items=0 ppid=1 pid=15974 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=92 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:15:01.647000 audit[15974]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff4536e4c0 a2=3 a3=0 items=0 ppid=1 pid=15974 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=92 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:15:01.647000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:15:02.010784 kernel: audit: type=1327 audit(1707812101.647:2175): proctitle=737368643A20636F7265205B707269765D Feb 13 08:15:02.010820 kernel: audit: type=1105 audit(1707812101.652:2176): pid=15974 uid=0 auid=500 ses=92 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:01.652000 audit[15974]: USER_START pid=15974 uid=0 auid=500 ses=92 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:02.105336 kernel: audit: type=1103 audit(1707812101.653:2177): pid=15976 uid=0 auid=500 ses=92 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:01.653000 audit[15976]: CRED_ACQ pid=15976 uid=0 auid=500 ses=92 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:02.194660 kernel: audit: type=1106 audit(1707812101.731:2178): pid=15974 uid=0 
auid=500 ses=92 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:01.731000 audit[15974]: USER_END pid=15974 uid=0 auid=500 ses=92 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:02.290203 kernel: audit: type=1104 audit(1707812101.731:2179): pid=15974 uid=0 auid=500 ses=92 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:01.731000 audit[15974]: CRED_DISP pid=15974 uid=0 auid=500 ses=92 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:01.733000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@99-145.40.90.207:22-139.178.68.195:59348 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:15:03.706057 env[1458]: time="2024-02-13T08:15:03.705964960Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 08:15:03.732266 env[1458]: time="2024-02-13T08:15:03.732231039Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:15:03.732438 kubelet[2569]: E0213 08:15:03.732402 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 08:15:03.732438 kubelet[2569]: E0213 08:15:03.732429 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 08:15:03.732626 kubelet[2569]: E0213 08:15:03.732451 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:15:03.732626 kubelet[2569]: E0213 08:15:03.732468 2569 pod_workers.go:1294] "Error syncing 
pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 08:15:05.706926 env[1458]: time="2024-02-13T08:15:05.706792923Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 08:15:05.763256 env[1458]: time="2024-02-13T08:15:05.763165459Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:15:05.763473 kubelet[2569]: E0213 08:15:05.763431 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 08:15:05.763473 kubelet[2569]: E0213 08:15:05.763472 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 08:15:05.763851 kubelet[2569]: E0213 08:15:05.763516 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:15:05.763851 kubelet[2569]: E0213 08:15:05.763551 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 08:15:06.706622 env[1458]: time="2024-02-13T08:15:06.706529538Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 08:15:06.733307 env[1458]: time="2024-02-13T08:15:06.733251310Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to 
destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:15:06.733586 kubelet[2569]: E0213 08:15:06.733471 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 08:15:06.733586 kubelet[2569]: E0213 08:15:06.733498 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 08:15:06.733586 kubelet[2569]: E0213 08:15:06.733522 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:15:06.733586 kubelet[2569]: E0213 08:15:06.733558 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 08:15:06.734495 systemd[1]: Started sshd@100-145.40.90.207:22-139.178.68.195:39666.service. Feb 13 08:15:06.733000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@100-145.40.90.207:22-139.178.68.195:39666 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:15:06.761183 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:15:06.761218 kernel: audit: type=1130 audit(1707812106.733:2181): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@100-145.40.90.207:22-139.178.68.195:39666 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:15:06.876000 audit[16092]: USER_ACCT pid=16092 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:06.877845 sshd[16092]: Accepted publickey for core from 139.178.68.195 port 39666 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:15:06.880133 sshd[16092]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:15:06.885694 systemd-logind[1446]: New session 93 of user core. Feb 13 08:15:06.887300 systemd[1]: Started session-93.scope. Feb 13 08:15:06.878000 audit[16092]: CRED_ACQ pid=16092 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:06.971257 sshd[16092]: pam_unix(sshd:session): session closed for user core Feb 13 08:15:06.972754 systemd[1]: sshd@100-145.40.90.207:22-139.178.68.195:39666.service: Deactivated successfully. Feb 13 08:15:06.973179 systemd[1]: session-93.scope: Deactivated successfully. Feb 13 08:15:06.973499 systemd-logind[1446]: Session 93 logged out. Waiting for processes to exit. Feb 13 08:15:06.974050 systemd-logind[1446]: Removed session 93. Feb 13 08:15:07.059861 kernel: audit: type=1101 audit(1707812106.876:2182): pid=16092 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:07.059899 kernel: audit: type=1103 audit(1707812106.878:2183): pid=16092 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:07.059919 kernel: audit: type=1006 audit(1707812106.879:2184): pid=16092 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=93 res=1 Feb 13 08:15:07.118530 kernel: audit: type=1300 audit(1707812106.879:2184): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc52169b80 a2=3 a3=0 items=0 ppid=1 pid=16092 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=93 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:15:06.879000 audit[16092]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc52169b80 a2=3 a3=0 items=0 ppid=1 pid=16092 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=93 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:15:07.210611 kernel: audit: type=1327 audit(1707812106.879:2184): proctitle=737368643A20636F7265205B707269765D Feb 13 08:15:06.879000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:15:07.241120 kernel: audit: type=1105 audit(1707812106.893:2185): pid=16092 uid=0 auid=500 ses=93 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 
08:15:06.893000 audit[16092]: USER_START pid=16092 uid=0 auid=500 ses=93 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:07.335662 kernel: audit: type=1103 audit(1707812106.894:2186): pid=16094 uid=0 auid=500 ses=93 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:06.894000 audit[16094]: CRED_ACQ pid=16094 uid=0 auid=500 ses=93 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:07.424956 kernel: audit: type=1106 audit(1707812106.971:2187): pid=16092 uid=0 auid=500 ses=93 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:06.971000 audit[16092]: USER_END pid=16092 uid=0 auid=500 ses=93 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:07.520533 kernel: audit: type=1104 audit(1707812106.971:2188): pid=16092 uid=0 auid=500 ses=93 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:06.971000 audit[16092]: CRED_DISP pid=16092 uid=0 auid=500 ses=93 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:06.972000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@100-145.40.90.207:22-139.178.68.195:39666 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:15:08.706055 env[1458]: time="2024-02-13T08:15:08.705935856Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 08:15:08.733174 env[1458]: time="2024-02-13T08:15:08.733069353Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:15:08.733361 kubelet[2569]: E0213 08:15:08.733350 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 08:15:08.733519 kubelet[2569]: E0213 08:15:08.733377 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 08:15:08.733519 kubelet[2569]: E0213 08:15:08.733399 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:15:08.733519 kubelet[2569]: E0213 08:15:08.733415 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 08:15:11.980367 systemd[1]: Started sshd@101-145.40.90.207:22-139.178.68.195:39670.service. Feb 13 08:15:11.979000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@101-145.40.90.207:22-139.178.68.195:39670 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:15:12.007389 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:15:12.007483 kernel: audit: type=1130 audit(1707812111.979:2190): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@101-145.40.90.207:22-139.178.68.195:39670 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:15:12.123000 audit[16148]: USER_ACCT pid=16148 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:12.124152 sshd[16148]: Accepted publickey for core from 139.178.68.195 port 39670 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:15:12.125499 sshd[16148]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:15:12.130062 systemd-logind[1446]: New session 94 of user core. Feb 13 08:15:12.130992 systemd[1]: Started session-94.scope. Feb 13 08:15:12.211011 sshd[16148]: pam_unix(sshd:session): session closed for user core Feb 13 08:15:12.212294 systemd[1]: sshd@101-145.40.90.207:22-139.178.68.195:39670.service: Deactivated successfully. Feb 13 08:15:12.212735 systemd[1]: session-94.scope: Deactivated successfully. Feb 13 08:15:12.213143 systemd-logind[1446]: Session 94 logged out. Waiting for processes to exit. Feb 13 08:15:12.213569 systemd-logind[1446]: Removed session 94. Feb 13 08:15:12.124000 audit[16148]: CRED_ACQ pid=16148 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:12.306046 kernel: audit: type=1101 audit(1707812112.123:2191): pid=16148 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:12.306087 kernel: audit: type=1103 audit(1707812112.124:2192): pid=16148 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:12.306106 kernel: audit: type=1006 audit(1707812112.124:2193): pid=16148 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=94 res=1 Feb 13 08:15:12.364716 kernel: audit: type=1300 audit(1707812112.124:2193): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc67d5b040 a2=3 a3=0 items=0 ppid=1 pid=16148 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=94 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:15:12.124000 audit[16148]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc67d5b040 a2=3 a3=0 items=0 ppid=1 pid=16148 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=94 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:15:12.456774 kernel: audit: type=1327 audit(1707812112.124:2193): proctitle=737368643A20636F7265205B707269765D Feb 13 08:15:12.124000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:15:12.132000 audit[16148]: USER_START pid=16148 uid=0 auid=500 ses=94 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:12.581764 kernel: audit: 
type=1105 audit(1707812112.132:2194): pid=16148 uid=0 auid=500 ses=94 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:12.581799 kernel: audit: type=1103 audit(1707812112.132:2195): pid=16150 uid=0 auid=500 ses=94 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:12.132000 audit[16150]: CRED_ACQ pid=16150 uid=0 auid=500 ses=94 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:12.671064 kernel: audit: type=1106 audit(1707812112.210:2196): pid=16148 uid=0 auid=500 ses=94 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:12.210000 audit[16148]: USER_END pid=16148 uid=0 auid=500 ses=94 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:12.766645 kernel: audit: type=1104 audit(1707812112.211:2197): pid=16148 uid=0 auid=500 ses=94 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:12.211000 audit[16148]: CRED_DISP pid=16148 uid=0 auid=500 ses=94 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:12.211000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@101-145.40.90.207:22-139.178.68.195:39670 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:15:16.705656 env[1458]: time="2024-02-13T08:15:16.705592358Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 08:15:16.722563 env[1458]: time="2024-02-13T08:15:16.722527665Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:15:16.722765 kubelet[2569]: E0213 08:15:16.722724 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 08:15:16.722765 kubelet[2569]: E0213 08:15:16.722758 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 08:15:16.722965 kubelet[2569]: E0213 08:15:16.722792 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:15:16.722965 kubelet[2569]: E0213 08:15:16.722823 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 08:15:17.222357 systemd[1]: Started sshd@102-145.40.90.207:22-139.178.68.195:34174.service. Feb 13 08:15:17.222000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@102-145.40.90.207:22-139.178.68.195:34174 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:15:17.260301 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:15:17.260401 kernel: audit: type=1130 audit(1707812117.222:2199): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@102-145.40.90.207:22-139.178.68.195:34174 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:15:17.375000 audit[16202]: USER_ACCT pid=16202 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:17.376073 sshd[16202]: Accepted publickey for core from 139.178.68.195 port 34174 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:15:17.378215 sshd[16202]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:15:17.380441 systemd-logind[1446]: New session 95 of user core. Feb 13 08:15:17.381216 systemd[1]: Started session-95.scope. Feb 13 08:15:17.461205 sshd[16202]: pam_unix(sshd:session): session closed for user core Feb 13 08:15:17.462620 systemd[1]: sshd@102-145.40.90.207:22-139.178.68.195:34174.service: Deactivated successfully. Feb 13 08:15:17.463086 systemd[1]: session-95.scope: Deactivated successfully. Feb 13 08:15:17.463449 systemd-logind[1446]: Session 95 logged out. Waiting for processes to exit. Feb 13 08:15:17.463902 systemd-logind[1446]: Removed session 95. Feb 13 08:15:17.377000 audit[16202]: CRED_ACQ pid=16202 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:17.558077 kernel: audit: type=1101 audit(1707812117.375:2200): pid=16202 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:17.558115 kernel: audit: type=1103 audit(1707812117.377:2201): pid=16202 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:17.558133 kernel: audit: type=1006 audit(1707812117.377:2202): pid=16202 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=95 res=1 Feb 13 08:15:17.616761 kernel: audit: type=1300 audit(1707812117.377:2202): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff08ed93b0 a2=3 a3=0 items=0 ppid=1 pid=16202 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=95 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:15:17.377000 audit[16202]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff08ed93b0 a2=3 a3=0 items=0 ppid=1 pid=16202 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=95 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:15:17.705727 env[1458]: time="2024-02-13T08:15:17.705708690Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 08:15:17.377000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:15:17.719120 env[1458]: time="2024-02-13T08:15:17.719062344Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed 
(delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:15:17.719278 kubelet[2569]: E0213 08:15:17.719267 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 08:15:17.719345 kubelet[2569]: E0213 08:15:17.719297 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 08:15:17.719345 kubelet[2569]: E0213 08:15:17.719332 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:15:17.719413 kubelet[2569]: E0213 08:15:17.719348 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 08:15:17.739384 kernel: audit: type=1327 audit(1707812117.377:2202): proctitle=737368643A20636F7265205B707269765D Feb 13 08:15:17.739466 kernel: audit: type=1105 audit(1707812117.383:2203): pid=16202 uid=0 auid=500 ses=95 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:17.383000 audit[16202]: USER_START pid=16202 uid=0 auid=500 ses=95 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:17.384000 audit[16204]: CRED_ACQ pid=16204 uid=0 auid=500 ses=95 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:17.923348 kernel: audit: type=1103 audit(1707812117.384:2204): pid=16204 uid=0 auid=500 ses=95 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:17.923391 
kernel: audit: type=1106 audit(1707812117.461:2205): pid=16202 uid=0 auid=500 ses=95 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:17.461000 audit[16202]: USER_END pid=16202 uid=0 auid=500 ses=95 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:18.019083 kernel: audit: type=1104 audit(1707812117.461:2206): pid=16202 uid=0 auid=500 ses=95 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:17.461000 audit[16202]: CRED_DISP pid=16202 uid=0 auid=500 ses=95 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:17.462000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@102-145.40.90.207:22-139.178.68.195:34174 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:15:20.706242 env[1458]: time="2024-02-13T08:15:20.706132140Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 08:15:20.755434 env[1458]: time="2024-02-13T08:15:20.755367202Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:15:20.755712 kubelet[2569]: E0213 08:15:20.755659 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 08:15:20.755712 kubelet[2569]: E0213 08:15:20.755702 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 08:15:20.756078 kubelet[2569]: E0213 08:15:20.755743 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:15:20.756078 
kubelet[2569]: E0213 08:15:20.755778 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 08:15:22.473997 systemd[1]: Started sshd@103-145.40.90.207:22-139.178.68.195:34182.service. Feb 13 08:15:22.474000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@103-145.40.90.207:22-139.178.68.195:34182 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:15:22.501791 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:15:22.501881 kernel: audit: type=1130 audit(1707812122.474:2208): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@103-145.40.90.207:22-139.178.68.195:34182 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:15:22.618000 audit[16288]: USER_ACCT pid=16288 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:22.619596 sshd[16288]: Accepted publickey for core from 139.178.68.195 port 34182 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:15:22.621909 sshd[16288]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:15:22.624347 systemd-logind[1446]: New session 96 of user core. Feb 13 08:15:22.625070 systemd[1]: Started session-96.scope. 
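[Editor's note: the StopPodSandbox failures in this log repeat verbatim for every sandbox (2eb78bff…, ce02c9de…, 1a7bd63d…, 8d49648f…) because the Calico CNI plugin aborts each delete the moment it fails to stat /var/lib/calico/nodename. Below is a minimal Go sketch of that gating check, written for illustration only — this is not the actual Calico plugin source. The assumption it encodes: calico/node writes the host's node name to that file at startup, so while the container is down or /var/lib/calico/ is not mounted on the host, the stat fails and every teardown returns the identical error.]

```go
// Hedged sketch of the check behind the repeated failure in this log;
// an illustration, not the real Calico code path.
package main

import (
	"fmt"
	"os"
)

const nodenameFile = "/var/lib/calico/nodename" // written by calico/node at startup

func main() {
	if _, err := os.Stat(nodenameFile); err != nil {
		// Reproduces the exact error shape kubelet keeps re-logging:
		// "stat /var/lib/calico/nodename: no such file or directory: check that ..."
		fmt.Printf("%v: check that the calico/node container is running and has mounted /var/lib/calico/\n", err)
		os.Exit(1)
	}
	name, err := os.ReadFile(nodenameFile)
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	fmt.Printf("nodename: %s\n", name) // sandbox teardown could proceed from here
}
```

[Under that assumption, kubelet retries and re-logs the identical KillPodSandboxError on each sync until calico/node is healthy again, which matches the repetition across sessions 93–99 in this log.]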
Feb 13 08:15:22.705234 env[1458]: time="2024-02-13T08:15:22.705186369Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 08:15:22.621000 audit[16288]: CRED_ACQ pid=16288 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:22.718445 env[1458]: time="2024-02-13T08:15:22.718388110Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:15:22.718564 kubelet[2569]: E0213 08:15:22.718549 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 08:15:22.718767 kubelet[2569]: E0213 08:15:22.718590 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 08:15:22.718767 kubelet[2569]: E0213 08:15:22.718622 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:15:22.718767 kubelet[2569]: E0213 08:15:22.718715 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 08:15:22.804847 kernel: audit: type=1101 audit(1707812122.618:2209): pid=16288 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:22.804887 kernel: audit: type=1103 audit(1707812122.621:2210): pid=16288 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh 
res=success' Feb 13 08:15:22.804906 kernel: audit: type=1006 audit(1707812122.621:2211): pid=16288 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=96 res=1 Feb 13 08:15:22.839587 sshd[16288]: pam_unix(sshd:session): session closed for user core Feb 13 08:15:22.841110 systemd[1]: sshd@103-145.40.90.207:22-139.178.68.195:34182.service: Deactivated successfully. Feb 13 08:15:22.841574 systemd[1]: session-96.scope: Deactivated successfully. Feb 13 08:15:22.841953 systemd-logind[1446]: Session 96 logged out. Waiting for processes to exit. Feb 13 08:15:22.842374 systemd-logind[1446]: Removed session 96. Feb 13 08:15:22.621000 audit[16288]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe2ca91050 a2=3 a3=0 items=0 ppid=1 pid=16288 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=96 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:15:22.955918 kernel: audit: type=1300 audit(1707812122.621:2211): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe2ca91050 a2=3 a3=0 items=0 ppid=1 pid=16288 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=96 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:15:22.955953 kernel: audit: type=1327 audit(1707812122.621:2211): proctitle=737368643A20636F7265205B707269765D Feb 13 08:15:22.621000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:15:22.986455 kernel: audit: type=1105 audit(1707812122.626:2212): pid=16288 uid=0 auid=500 ses=96 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:22.626000 audit[16288]: USER_START pid=16288 uid=0 auid=500 ses=96 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:23.081116 kernel: audit: type=1103 audit(1707812122.627:2213): pid=16290 uid=0 auid=500 ses=96 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:22.627000 audit[16290]: CRED_ACQ pid=16290 uid=0 auid=500 ses=96 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:23.170475 kernel: audit: type=1106 audit(1707812122.839:2214): pid=16288 uid=0 auid=500 ses=96 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:22.839000 audit[16288]: USER_END pid=16288 uid=0 auid=500 ses=96 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:23.266140 
kernel: audit: type=1104 audit(1707812122.839:2215): pid=16288 uid=0 auid=500 ses=96 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:22.839000 audit[16288]: CRED_DISP pid=16288 uid=0 auid=500 ses=96 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:22.840000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@103-145.40.90.207:22-139.178.68.195:34182 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:15:27.849430 systemd[1]: Started sshd@104-145.40.90.207:22-139.178.68.195:50666.service. Feb 13 08:15:27.848000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@104-145.40.90.207:22-139.178.68.195:50666 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:15:27.876560 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:15:27.876603 kernel: audit: type=1130 audit(1707812127.848:2217): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@104-145.40.90.207:22-139.178.68.195:50666 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:15:27.993000 audit[16344]: USER_ACCT pid=16344 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:27.994499 sshd[16344]: Accepted publickey for core from 139.178.68.195 port 50666 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:15:27.995925 sshd[16344]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:15:27.998291 systemd-logind[1446]: New session 97 of user core. Feb 13 08:15:27.998999 systemd[1]: Started session-97.scope. Feb 13 08:15:28.079049 sshd[16344]: pam_unix(sshd:session): session closed for user core Feb 13 08:15:28.080521 systemd[1]: sshd@104-145.40.90.207:22-139.178.68.195:50666.service: Deactivated successfully. Feb 13 08:15:28.080999 systemd[1]: session-97.scope: Deactivated successfully. Feb 13 08:15:28.081365 systemd-logind[1446]: Session 97 logged out. Waiting for processes to exit. Feb 13 08:15:28.082167 systemd-logind[1446]: Removed session 97. 
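[Editor's note: each short-lived SSH session above leaves the same audit trail: USER_ACCT and the first CRED_ACQ arrive before a session id exists (hence ses=4294967295, the unset value), a type=1006 LOGIN record then binds the new ses number, and USER_START, USER_END, CRED_DISP, plus the SERVICE_STOP for the per-connection sshd@… unit follow. The Go sketch below is hypothetical tooling, not anything shown running on this host: it pulls the record type and ses field out of lines in this format, enough to group one session's records together.]

```go
// Hedged sketch: extract "ses=<id> <record type>" pairs from audit log lines
// like the ones above. Kernel "audit: type=..." echoes of the same records
// lack the "audit[pid]:" prefix and are intentionally skipped, avoiding
// double counting.
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

// Non-greedy .*? so several records concatenated on one physical line
// each produce their own match.
var rec = regexp.MustCompile(`audit\[\d+\]: (\w+) .*?ses=(\d+)`)

func main() {
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 1<<20), 1<<20) // wrapped log lines here are very long
	for sc.Scan() {
		for _, m := range rec.FindAllStringSubmatch(sc.Text(), -1) {
			fmt.Printf("ses=%s\t%s\n", m[2], m[1])
		}
	}
	if err := sc.Err(); err != nil {
		fmt.Fprintln(os.Stderr, err)
	}
}
```

[Fed this log on stdin, it would print, e.g., "ses=4294967295 USER_ACCT", "ses=97 USER_START", "ses=97 USER_END", "ses=97 CRED_DISP", one line per audit record.]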
Feb 13 08:15:27.995000 audit[16344]: CRED_ACQ pid=16344 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:28.176691 kernel: audit: type=1101 audit(1707812127.993:2218): pid=16344 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:28.176738 kernel: audit: type=1103 audit(1707812127.995:2219): pid=16344 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:28.176761 kernel: audit: type=1006 audit(1707812127.995:2220): pid=16344 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=97 res=1 Feb 13 08:15:28.235429 kernel: audit: type=1300 audit(1707812127.995:2220): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff908720f0 a2=3 a3=0 items=0 ppid=1 pid=16344 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=97 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:15:27.995000 audit[16344]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff908720f0 a2=3 a3=0 items=0 ppid=1 pid=16344 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=97 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:15:28.327551 kernel: audit: type=1327 audit(1707812127.995:2220): proctitle=737368643A20636F7265205B707269765D Feb 13 08:15:27.995000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:15:28.358080 kernel: audit: type=1105 audit(1707812128.000:2221): pid=16344 uid=0 auid=500 ses=97 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:28.000000 audit[16344]: USER_START pid=16344 uid=0 auid=500 ses=97 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:28.452716 kernel: audit: type=1103 audit(1707812128.001:2222): pid=16346 uid=0 auid=500 ses=97 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:28.001000 audit[16346]: CRED_ACQ pid=16346 uid=0 auid=500 ses=97 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:28.542035 kernel: audit: type=1106 audit(1707812128.079:2223): pid=16344 uid=0 auid=500 ses=97 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close 
grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:28.079000 audit[16344]: USER_END pid=16344 uid=0 auid=500 ses=97 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:28.637646 kernel: audit: type=1104 audit(1707812128.079:2224): pid=16344 uid=0 auid=500 ses=97 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:28.079000 audit[16344]: CRED_DISP pid=16344 uid=0 auid=500 ses=97 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:28.080000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@104-145.40.90.207:22-139.178.68.195:50666 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:15:29.705473 env[1458]: time="2024-02-13T08:15:29.705424449Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 08:15:29.718187 env[1458]: time="2024-02-13T08:15:29.718147788Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:15:29.718398 kubelet[2569]: E0213 08:15:29.718359 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 08:15:29.718398 kubelet[2569]: E0213 08:15:29.718391 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 08:15:29.718618 kubelet[2569]: E0213 08:15:29.718419 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:15:29.718618 kubelet[2569]: E0213 08:15:29.718440 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for 
\"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 08:15:32.706464 env[1458]: time="2024-02-13T08:15:32.706368411Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 08:15:32.706464 env[1458]: time="2024-02-13T08:15:32.706370823Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 08:15:32.733028 env[1458]: time="2024-02-13T08:15:32.732958345Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:15:32.733028 env[1458]: time="2024-02-13T08:15:32.732959299Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:15:32.733185 kubelet[2569]: E0213 08:15:32.733146 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 08:15:32.733185 kubelet[2569]: E0213 08:15:32.733174 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 08:15:32.733352 kubelet[2569]: E0213 08:15:32.733196 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:15:32.733352 kubelet[2569]: E0213 08:15:32.733146 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container 
is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 08:15:32.733352 kubelet[2569]: E0213 08:15:32.733216 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 08:15:32.733352 kubelet[2569]: E0213 08:15:32.733236 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 08:15:32.733475 kubelet[2569]: E0213 08:15:32.733259 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:15:32.733475 kubelet[2569]: E0213 08:15:32.733276 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 08:15:33.090503 systemd[1]: Started sshd@105-145.40.90.207:22-139.178.68.195:50678.service. Feb 13 08:15:33.090000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@105-145.40.90.207:22-139.178.68.195:50678 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:15:33.117969 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:15:33.118073 kernel: audit: type=1130 audit(1707812133.090:2226): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@105-145.40.90.207:22-139.178.68.195:50678 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:15:33.236525 sshd[16456]: Accepted publickey for core from 139.178.68.195 port 50678 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:15:33.235000 audit[16456]: USER_ACCT pid=16456 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:33.237906 sshd[16456]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:15:33.240318 systemd-logind[1446]: New session 98 of user core. Feb 13 08:15:33.240809 systemd[1]: Started session-98.scope. Feb 13 08:15:33.321876 sshd[16456]: pam_unix(sshd:session): session closed for user core Feb 13 08:15:33.323350 systemd[1]: sshd@105-145.40.90.207:22-139.178.68.195:50678.service: Deactivated successfully. Feb 13 08:15:33.323829 systemd[1]: session-98.scope: Deactivated successfully. Feb 13 08:15:33.324256 systemd-logind[1446]: Session 98 logged out. Waiting for processes to exit. Feb 13 08:15:33.325036 systemd-logind[1446]: Removed session 98. Feb 13 08:15:33.237000 audit[16456]: CRED_ACQ pid=16456 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:33.421224 kernel: audit: type=1101 audit(1707812133.235:2227): pid=16456 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:33.421274 kernel: audit: type=1103 audit(1707812133.237:2228): pid=16456 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:33.421296 kernel: audit: type=1006 audit(1707812133.237:2229): pid=16456 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=98 res=1 Feb 13 08:15:33.479918 kernel: audit: type=1300 audit(1707812133.237:2229): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff650b5800 a2=3 a3=0 items=0 ppid=1 pid=16456 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=98 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:15:33.237000 audit[16456]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff650b5800 a2=3 a3=0 items=0 ppid=1 pid=16456 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=98 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:15:33.572148 kernel: audit: type=1327 audit(1707812133.237:2229): proctitle=737368643A20636F7265205B707269765D Feb 13 08:15:33.237000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:15:33.602697 kernel: audit: type=1105 audit(1707812133.242:2230): pid=16456 uid=0 auid=500 ses=98 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 
08:15:33.242000 audit[16456]: USER_START pid=16456 uid=0 auid=500 ses=98 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:33.697280 kernel: audit: type=1103 audit(1707812133.243:2231): pid=16458 uid=0 auid=500 ses=98 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:33.243000 audit[16458]: CRED_ACQ pid=16458 uid=0 auid=500 ses=98 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:33.786626 kernel: audit: type=1106 audit(1707812133.321:2232): pid=16456 uid=0 auid=500 ses=98 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:33.321000 audit[16456]: USER_END pid=16456 uid=0 auid=500 ses=98 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:33.882276 kernel: audit: type=1104 audit(1707812133.321:2233): pid=16456 uid=0 auid=500 ses=98 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:33.321000 audit[16456]: CRED_DISP pid=16456 uid=0 auid=500 ses=98 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:33.322000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@105-145.40.90.207:22-139.178.68.195:50678 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:15:35.706234 env[1458]: time="2024-02-13T08:15:35.706138079Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 08:15:35.726703 env[1458]: time="2024-02-13T08:15:35.726643227Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:15:35.726881 kubelet[2569]: E0213 08:15:35.726842 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 08:15:35.726881 kubelet[2569]: E0213 08:15:35.726868 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 08:15:35.727077 kubelet[2569]: E0213 08:15:35.726890 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:15:35.727077 kubelet[2569]: E0213 08:15:35.726909 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 08:15:38.331813 systemd[1]: Started sshd@106-145.40.90.207:22-139.178.68.195:39730.service. Feb 13 08:15:38.331000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@106-145.40.90.207:22-139.178.68.195:39730 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:15:38.359002 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:15:38.359055 kernel: audit: type=1130 audit(1707812138.331:2235): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@106-145.40.90.207:22-139.178.68.195:39730 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:15:38.476000 audit[16513]: USER_ACCT pid=16513 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:38.477750 sshd[16513]: Accepted publickey for core from 139.178.68.195 port 39730 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:15:38.478926 sshd[16513]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:15:38.481223 systemd-logind[1446]: New session 99 of user core. Feb 13 08:15:38.481918 systemd[1]: Started session-99.scope. Feb 13 08:15:38.561489 sshd[16513]: pam_unix(sshd:session): session closed for user core Feb 13 08:15:38.562826 systemd[1]: sshd@106-145.40.90.207:22-139.178.68.195:39730.service: Deactivated successfully. Feb 13 08:15:38.563323 systemd[1]: session-99.scope: Deactivated successfully. Feb 13 08:15:38.563617 systemd-logind[1446]: Session 99 logged out. Waiting for processes to exit. Feb 13 08:15:38.564022 systemd-logind[1446]: Removed session 99. Feb 13 08:15:38.478000 audit[16513]: CRED_ACQ pid=16513 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:38.660044 kernel: audit: type=1101 audit(1707812138.476:2236): pid=16513 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:38.660085 kernel: audit: type=1103 audit(1707812138.478:2237): pid=16513 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:38.660105 kernel: audit: type=1006 audit(1707812138.478:2238): pid=16513 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=99 res=1 Feb 13 08:15:38.718758 kernel: audit: type=1300 audit(1707812138.478:2238): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe11d710f0 a2=3 a3=0 items=0 ppid=1 pid=16513 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=99 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:15:38.478000 audit[16513]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe11d710f0 a2=3 a3=0 items=0 ppid=1 pid=16513 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=99 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:15:38.810962 kernel: audit: type=1327 audit(1707812138.478:2238): proctitle=737368643A20636F7265205B707269765D Feb 13 08:15:38.478000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:15:38.841482 kernel: audit: type=1105 audit(1707812138.483:2239): pid=16513 uid=0 auid=500 ses=99 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 
08:15:38.483000 audit[16513]: USER_START pid=16513 uid=0 auid=500 ses=99 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:38.936176 kernel: audit: type=1103 audit(1707812138.484:2240): pid=16515 uid=0 auid=500 ses=99 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:38.484000 audit[16515]: CRED_ACQ pid=16515 uid=0 auid=500 ses=99 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:39.025598 kernel: audit: type=1106 audit(1707812138.561:2241): pid=16513 uid=0 auid=500 ses=99 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:38.561000 audit[16513]: USER_END pid=16513 uid=0 auid=500 ses=99 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:39.121300 kernel: audit: type=1104 audit(1707812138.561:2242): pid=16513 uid=0 auid=500 ses=99 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:38.561000 audit[16513]: CRED_DISP pid=16513 uid=0 auid=500 ses=99 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:38.562000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@106-145.40.90.207:22-139.178.68.195:39730 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:15:40.706837 env[1458]: time="2024-02-13T08:15:40.706750543Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 08:15:40.732960 env[1458]: time="2024-02-13T08:15:40.732924489Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:15:40.733131 kubelet[2569]: E0213 08:15:40.733120 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 08:15:40.733324 kubelet[2569]: E0213 08:15:40.733148 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 08:15:40.733324 kubelet[2569]: E0213 08:15:40.733182 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:15:40.733324 kubelet[2569]: E0213 08:15:40.733210 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 08:15:43.571736 systemd[1]: Started sshd@107-145.40.90.207:22-139.178.68.195:39742.service. Feb 13 08:15:43.571000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@107-145.40.90.207:22-139.178.68.195:39742 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:15:43.598908 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:15:43.598980 kernel: audit: type=1130 audit(1707812143.571:2244): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@107-145.40.90.207:22-139.178.68.195:39742 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:15:43.715000 audit[16567]: USER_ACCT pid=16567 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:43.716528 sshd[16567]: Accepted publickey for core from 139.178.68.195 port 39742 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:15:43.717933 sshd[16567]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:15:43.720400 systemd-logind[1446]: New session 100 of user core. Feb 13 08:15:43.720847 systemd[1]: Started session-100.scope. Feb 13 08:15:43.799040 sshd[16567]: pam_unix(sshd:session): session closed for user core Feb 13 08:15:43.800402 systemd[1]: sshd@107-145.40.90.207:22-139.178.68.195:39742.service: Deactivated successfully. Feb 13 08:15:43.800835 systemd[1]: session-100.scope: Deactivated successfully. Feb 13 08:15:43.801216 systemd-logind[1446]: Session 100 logged out. Waiting for processes to exit. Feb 13 08:15:43.801695 systemd-logind[1446]: Removed session 100. Feb 13 08:15:43.717000 audit[16567]: CRED_ACQ pid=16567 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:43.898775 kernel: audit: type=1101 audit(1707812143.715:2245): pid=16567 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:43.898813 kernel: audit: type=1103 audit(1707812143.717:2246): pid=16567 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:43.898829 kernel: audit: type=1006 audit(1707812143.717:2247): pid=16567 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=100 res=1 Feb 13 08:15:43.957554 kernel: audit: type=1300 audit(1707812143.717:2247): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd49a783d0 a2=3 a3=0 items=0 ppid=1 pid=16567 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=100 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:15:43.717000 audit[16567]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd49a783d0 a2=3 a3=0 items=0 ppid=1 pid=16567 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=100 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:15:44.049812 kernel: audit: type=1327 audit(1707812143.717:2247): proctitle=737368643A20636F7265205B707269765D Feb 13 08:15:43.717000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:15:44.080310 kernel: audit: type=1105 audit(1707812143.722:2248): pid=16567 uid=0 auid=500 ses=100 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 
08:15:43.722000 audit[16567]: USER_START pid=16567 uid=0 auid=500 ses=100 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:44.175106 kernel: audit: type=1103 audit(1707812143.722:2249): pid=16569 uid=0 auid=500 ses=100 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:43.722000 audit[16569]: CRED_ACQ pid=16569 uid=0 auid=500 ses=100 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:44.264492 kernel: audit: type=1106 audit(1707812143.798:2250): pid=16567 uid=0 auid=500 ses=100 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:43.798000 audit[16567]: USER_END pid=16567 uid=0 auid=500 ses=100 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:44.360247 kernel: audit: type=1104 audit(1707812143.799:2251): pid=16567 uid=0 auid=500 ses=100 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:43.799000 audit[16567]: CRED_DISP pid=16567 uid=0 auid=500 ses=100 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:43.799000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@107-145.40.90.207:22-139.178.68.195:39742 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:15:45.707267 env[1458]: time="2024-02-13T08:15:45.707173491Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 08:15:45.755077 env[1458]: time="2024-02-13T08:15:45.754978069Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:15:45.755303 kubelet[2569]: E0213 08:15:45.755276 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 08:15:45.755676 kubelet[2569]: E0213 08:15:45.755331 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 08:15:45.755676 kubelet[2569]: E0213 08:15:45.755400 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:15:45.755676 kubelet[2569]: E0213 08:15:45.755439 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 08:15:46.124000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:15:46.124000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c00085c810 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:15:46.124000 audit: PROCTITLE 
proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:15:46.124000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:15:46.124000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c0002b7340 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:15:46.124000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:15:46.192000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:15:46.192000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5c a1=c00976a000 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:15:46.192000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:15:46.193000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/apiserver.crt" dev="sdb9" ino=524817 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:15:46.193000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:15:46.193000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5d a1=c0099b16b0 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:15:46.193000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:15:46.193000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5c a1=c0095c68e0 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:15:46.193000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:15:46.946000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:15:46.946000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5d a1=c00f0d3290 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:15:46.946000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:15:46.946000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-client.crt" dev="sdb9" ino=524823 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:15:46.946000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5c a1=c00ae944e0 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:15:46.946000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:15:46.946000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:15:46.946000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5c a1=c00e7f0a20 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:15:46.946000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:15:47.706910 env[1458]: time="2024-02-13T08:15:47.706789098Z" level=info msg="StopPodSandbox for 
\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 08:15:47.742822 env[1458]: time="2024-02-13T08:15:47.742748764Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:15:47.742973 kubelet[2569]: E0213 08:15:47.742951 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 08:15:47.743230 kubelet[2569]: E0213 08:15:47.742986 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 08:15:47.743230 kubelet[2569]: E0213 08:15:47.743023 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:15:47.743230 kubelet[2569]: E0213 08:15:47.743054 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 08:15:48.803208 systemd[1]: Started sshd@108-145.40.90.207:22-139.178.68.195:35460.service. Feb 13 08:15:48.802000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@108-145.40.90.207:22-139.178.68.195:35460 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:15:48.829927 kernel: kauditd_printk_skb: 25 callbacks suppressed Feb 13 08:15:48.830058 kernel: audit: type=1130 audit(1707812148.802:2261): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@108-145.40.90.207:22-139.178.68.195:35460 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:15:48.946094 sshd[16651]: Accepted publickey for core from 139.178.68.195 port 35460 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:15:48.945000 audit[16651]: USER_ACCT pid=16651 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:48.948930 sshd[16651]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:15:48.951306 systemd-logind[1446]: New session 101 of user core. Feb 13 08:15:48.951726 systemd[1]: Started session-101.scope. Feb 13 08:15:49.031941 sshd[16651]: pam_unix(sshd:session): session closed for user core Feb 13 08:15:49.033446 systemd[1]: sshd@108-145.40.90.207:22-139.178.68.195:35460.service: Deactivated successfully. Feb 13 08:15:49.034000 systemd[1]: session-101.scope: Deactivated successfully. Feb 13 08:15:49.034463 systemd-logind[1446]: Session 101 logged out. Waiting for processes to exit. Feb 13 08:15:49.035221 systemd-logind[1446]: Removed session 101. Feb 13 08:15:48.948000 audit[16651]: CRED_ACQ pid=16651 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:49.131982 kernel: audit: type=1101 audit(1707812148.945:2262): pid=16651 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:49.132043 kernel: audit: type=1103 audit(1707812148.948:2263): pid=16651 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:49.132079 kernel: audit: type=1006 audit(1707812148.948:2264): pid=16651 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=101 res=1 Feb 13 08:15:48.948000 audit[16651]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe63895ee0 a2=3 a3=0 items=0 ppid=1 pid=16651 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=101 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:15:49.283079 kernel: audit: type=1300 audit(1707812148.948:2264): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe63895ee0 a2=3 a3=0 items=0 ppid=1 pid=16651 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=101 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:15:49.283195 kernel: audit: type=1327 audit(1707812148.948:2264): proctitle=737368643A20636F7265205B707269765D Feb 13 08:15:48.948000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:15:48.953000 audit[16651]: USER_START pid=16651 uid=0 auid=500 ses=101 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:49.408373 kernel: 
audit: type=1105 audit(1707812148.953:2265): pid=16651 uid=0 auid=500 ses=101 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:49.408424 kernel: audit: type=1103 audit(1707812148.953:2266): pid=16653 uid=0 auid=500 ses=101 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:48.953000 audit[16653]: CRED_ACQ pid=16653 uid=0 auid=500 ses=101 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:49.032000 audit[16651]: USER_END pid=16651 uid=0 auid=500 ses=101 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:49.593575 kernel: audit: type=1106 audit(1707812149.032:2267): pid=16651 uid=0 auid=500 ses=101 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:49.593623 kernel: audit: type=1104 audit(1707812149.032:2268): pid=16651 uid=0 auid=500 ses=101 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:49.032000 audit[16651]: CRED_DISP pid=16651 uid=0 auid=500 ses=101 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:49.032000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@108-145.40.90.207:22-139.178.68.195:35460 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:15:49.705961 env[1458]: time="2024-02-13T08:15:49.705930369Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 08:15:49.718560 env[1458]: time="2024-02-13T08:15:49.718501291Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:15:49.718687 kubelet[2569]: E0213 08:15:49.718650 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 08:15:49.718687 kubelet[2569]: E0213 08:15:49.718679 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 08:15:49.719025 kubelet[2569]: E0213 08:15:49.718701 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:15:49.719025 kubelet[2569]: E0213 08:15:49.718722 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 08:15:50.878000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:15:50.878000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:15:50.878000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c0002b7d60 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 
13 08:15:50.878000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:15:50.878000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c001b23ae0 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:15:50.878000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:15:50.879000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:15:50.879000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c000c24140 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:15:50.879000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:15:50.884000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:15:50.884000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c001c96100 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:15:50.884000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:15:53.706366 env[1458]: time="2024-02-13T08:15:53.706261573Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 08:15:53.721080 env[1458]: time="2024-02-13T08:15:53.721041058Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
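
The type=1400 AVC records above show kube-controller-manager and kube-apiserver being denied the SELinux "watch" permission on the certificate files under /etc/kubernetes/pki: on x86_64, syscall=254 is inotify_add_watch, and exit=-13 is -EACCES. The PROCTITLE record that accompanies each denial is the process command line, hex-encoded with NUL-separated arguments and truncated by the kernel. A minimal decoding sketch in Python (standard library only), using one PROCTITLE value copied verbatim from the records above:

    import errno

    # PROCTITLE value copied from the audit records above. The kernel
    # truncates this field, which is why the last flag is cut short.
    hex_title = (
        "6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F"
        "636174652D6E6F64652D63696472733D74727565002D2D61757468656E74"
        "69636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E"
        "657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D"
        "2D617574686F7269"
    )
    argv = bytes.fromhex(hex_title).split(b"\x00")  # args are NUL-separated
    print(" ".join(a.decode("utf-8", "replace") for a in argv))
    # -> kube-controller-manager --allocate-node-cidrs=true
    #    --authentication-kubeconfig=/etc/kubernetes/controller-manager.conf --authori
    print(errno.errorcode[13])  # the SYSCALL records' exit=-13 -> EACCES

The same denials recur throughout this journal while both components keep running and logging, so the refused inotify watches are evidently non-fatal here.
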
Feb 13 08:15:53.721254 kubelet[2569]: E0213 08:15:53.721242 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 08:15:53.721426 kubelet[2569]: E0213 08:15:53.721271 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 08:15:53.721426 kubelet[2569]: E0213 08:15:53.721296 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:15:53.721426 kubelet[2569]: E0213 08:15:53.721316 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 08:15:54.041771 systemd[1]: Started sshd@109-145.40.90.207:22-139.178.68.195:35472.service. Feb 13 08:15:54.041000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@109-145.40.90.207:22-139.178.68.195:35472 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:15:54.069027 kernel: kauditd_printk_skb: 13 callbacks suppressed Feb 13 08:15:54.069144 kernel: audit: type=1130 audit(1707812154.041:2274): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@109-145.40.90.207:22-139.178.68.195:35472 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:15:54.187000 audit[16737]: USER_ACCT pid=16737 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:54.187884 sshd[16737]: Accepted publickey for core from 139.178.68.195 port 35472 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:15:54.189918 sshd[16737]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:15:54.192328 systemd-logind[1446]: New session 102 of user core. Feb 13 08:15:54.192800 systemd[1]: Started session-102.scope. 
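
Every one of the StopPodSandbox failures above ends the same way: the Calico CNI plugin cannot stat /var/lib/calico/nodename, the file that a running calico/node container writes (with /var/lib/calico/ mounted) so that CNI operations can learn the node name. Until that file exists, the kubelet's teardown retries will keep failing exactly as logged. A minimal diagnostic sketch, assuming it is run directly on this node:

    import sys

    # The error text in the records above names this exact check.
    PATH = "/var/lib/calico/nodename"
    try:
        with open(PATH) as f:
            print(f"{PATH} present, nodename={f.read().strip()!r}")
    except FileNotFoundError:
        sys.exit(f"{PATH} missing: calico/node is not running here, or does "
                 "not have /var/lib/calico/ mounted, so every sandbox "
                 "teardown will keep failing as in the records above")

This matches the hint embedded in the error message itself ("check that the calico/node container is running and has mounted /var/lib/calico/").
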
Feb 13 08:15:54.189000 audit[16737]: CRED_ACQ pid=16737 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:54.283756 sshd[16737]: pam_unix(sshd:session): session closed for user core Feb 13 08:15:54.285154 systemd[1]: sshd@109-145.40.90.207:22-139.178.68.195:35472.service: Deactivated successfully. Feb 13 08:15:54.285615 systemd[1]: session-102.scope: Deactivated successfully. Feb 13 08:15:54.285990 systemd-logind[1446]: Session 102 logged out. Waiting for processes to exit. Feb 13 08:15:54.286463 systemd-logind[1446]: Removed session 102. Feb 13 08:15:54.370128 kernel: audit: type=1101 audit(1707812154.187:2275): pid=16737 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:54.370165 kernel: audit: type=1103 audit(1707812154.189:2276): pid=16737 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:54.370184 kernel: audit: type=1006 audit(1707812154.189:2277): pid=16737 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=102 res=1 Feb 13 08:15:54.428974 kernel: audit: type=1300 audit(1707812154.189:2277): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffc4e670a0 a2=3 a3=0 items=0 ppid=1 pid=16737 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=102 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:15:54.189000 audit[16737]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffc4e670a0 a2=3 a3=0 items=0 ppid=1 pid=16737 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=102 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:15:54.521238 kernel: audit: type=1327 audit(1707812154.189:2277): proctitle=737368643A20636F7265205B707269765D Feb 13 08:15:54.189000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:15:54.194000 audit[16737]: USER_START pid=16737 uid=0 auid=500 ses=102 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:54.646459 kernel: audit: type=1105 audit(1707812154.194:2278): pid=16737 uid=0 auid=500 ses=102 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:54.646492 kernel: audit: type=1103 audit(1707812154.195:2279): pid=16739 uid=0 auid=500 ses=102 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:54.195000 audit[16739]: CRED_ACQ pid=16739 uid=0 auid=500 ses=102 
subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:54.735874 kernel: audit: type=1106 audit(1707812154.283:2280): pid=16737 uid=0 auid=500 ses=102 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:54.283000 audit[16737]: USER_END pid=16737 uid=0 auid=500 ses=102 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:54.831592 kernel: audit: type=1104 audit(1707812154.283:2281): pid=16737 uid=0 auid=500 ses=102 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:54.283000 audit[16737]: CRED_DISP pid=16737 uid=0 auid=500 ses=102 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:54.284000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@109-145.40.90.207:22-139.178.68.195:35472 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:15:59.290382 systemd[1]: Started sshd@110-145.40.90.207:22-139.178.68.195:40502.service. Feb 13 08:15:59.290000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@110-145.40.90.207:22-139.178.68.195:40502 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:15:59.317824 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:15:59.317878 kernel: audit: type=1130 audit(1707812159.290:2283): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@110-145.40.90.207:22-139.178.68.195:40502 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:15:59.434000 audit[16761]: USER_ACCT pid=16761 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:59.435215 sshd[16761]: Accepted publickey for core from 139.178.68.195 port 40502 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:15:59.435925 sshd[16761]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:15:59.438068 systemd-logind[1446]: New session 103 of user core. Feb 13 08:15:59.438720 systemd[1]: Started session-103.scope. Feb 13 08:15:59.518858 sshd[16761]: pam_unix(sshd:session): session closed for user core Feb 13 08:15:59.520354 systemd[1]: sshd@110-145.40.90.207:22-139.178.68.195:40502.service: Deactivated successfully. Feb 13 08:15:59.520927 systemd[1]: session-103.scope: Deactivated successfully. 
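
The kernel echoes above also pair each numeric audit type with its name: 1101 USER_ACCT, 1103 CRED_ACQ, 1105 USER_START, 1106 USER_END, 1104 CRED_DISP, plus 1130/1131 SERVICE_START/SERVICE_STOP for the per-connection sshd@... units. Every session from 139.178.68.195 (99 through 104 in this excerpt) leaves that same fingerprint and lasts well under a second, which looks more like automated probing or health checking than interactive use. A throwaway pairing sketch; "journal.txt" is a placeholder name for a saved copy of this log:

    import re

    # Pair systemd-logind's "New session N" with "Removed session N".
    pat = re.compile(r"(\d\d:\d\d:\d\d\.\d+) systemd-logind\[\d+\]: "
                     r"(New|Removed) session (\d+)")
    opened = {}
    for line in open("journal.txt"):
        for ts, verb, ses in pat.findall(line):
            if verb == "New":
                opened[ses] = ts
            elif ses in opened:
                print(f"session {ses}: {opened.pop(ses)} -> {ts}")
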
Feb 13 08:15:59.521366 systemd-logind[1446]: Session 103 logged out. Waiting for processes to exit. Feb 13 08:15:59.521794 systemd-logind[1446]: Removed session 103. Feb 13 08:15:59.528650 kernel: audit: type=1101 audit(1707812159.434:2284): pid=16761 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:59.528735 kernel: audit: type=1103 audit(1707812159.435:2285): pid=16761 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:59.435000 audit[16761]: CRED_ACQ pid=16761 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:59.678262 kernel: audit: type=1006 audit(1707812159.435:2286): pid=16761 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=103 res=1 Feb 13 08:15:59.678296 kernel: audit: type=1300 audit(1707812159.435:2286): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc84636490 a2=3 a3=0 items=0 ppid=1 pid=16761 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=103 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:15:59.435000 audit[16761]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc84636490 a2=3 a3=0 items=0 ppid=1 pid=16761 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=103 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:15:59.705895 env[1458]: time="2024-02-13T08:15:59.705874086Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 08:15:59.717421 env[1458]: time="2024-02-13T08:15:59.717361402Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:15:59.717549 kubelet[2569]: E0213 08:15:59.717539 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 08:15:59.717720 kubelet[2569]: E0213 08:15:59.717564 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 08:15:59.717720 kubelet[2569]: E0213 08:15:59.717586 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for 
\"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:15:59.717720 kubelet[2569]: E0213 08:15:59.717604 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 08:15:59.729363 systemd[1]: Started sshd@111-145.40.90.207:22-43.153.220.201:45864.service. Feb 13 08:15:59.770549 kernel: audit: type=1327 audit(1707812159.435:2286): proctitle=737368643A20636F7265205B707269765D Feb 13 08:15:59.435000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:15:59.801056 kernel: audit: type=1105 audit(1707812159.440:2287): pid=16761 uid=0 auid=500 ses=103 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:59.440000 audit[16761]: USER_START pid=16761 uid=0 auid=500 ses=103 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:59.895893 kernel: audit: type=1103 audit(1707812159.440:2288): pid=16763 uid=0 auid=500 ses=103 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:59.440000 audit[16763]: CRED_ACQ pid=16763 uid=0 auid=500 ses=103 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:59.985260 kernel: audit: type=1106 audit(1707812159.518:2289): pid=16761 uid=0 auid=500 ses=103 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:59.518000 audit[16761]: USER_END pid=16761 uid=0 auid=500 ses=103 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:00.080941 kernel: audit: type=1104 audit(1707812159.518:2290): pid=16761 uid=0 auid=500 ses=103 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred 
grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:59.518000 audit[16761]: CRED_DISP pid=16761 uid=0 auid=500 ses=103 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:15:59.519000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@110-145.40.90.207:22-139.178.68.195:40502 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:15:59.728000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@111-145.40.90.207:22-43.153.220.201:45864 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:00.779124 sshd[16815]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=43.153.220.201 user=root Feb 13 08:16:00.778000 audit[16815]: USER_AUTH pid=16815 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? acct="root" exe="/usr/sbin/sshd" hostname=43.153.220.201 addr=43.153.220.201 terminal=ssh res=failed' Feb 13 08:16:02.609913 sshd[16815]: Failed password for root from 43.153.220.201 port 45864 ssh2 Feb 13 08:16:02.706111 env[1458]: time="2024-02-13T08:16:02.706029028Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 08:16:02.706957 env[1458]: time="2024-02-13T08:16:02.706073543Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 08:16:02.733036 env[1458]: time="2024-02-13T08:16:02.732997937Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:16:02.733036 env[1458]: time="2024-02-13T08:16:02.733011252Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:16:02.733232 kubelet[2569]: E0213 08:16:02.733178 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 08:16:02.733232 kubelet[2569]: E0213 08:16:02.733204 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 
08:16:02.733232 kubelet[2569]: E0213 08:16:02.733226 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:16:02.733453 kubelet[2569]: E0213 08:16:02.733243 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 08:16:02.733453 kubelet[2569]: E0213 08:16:02.733178 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 08:16:02.733453 kubelet[2569]: E0213 08:16:02.733257 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 08:16:02.733453 kubelet[2569]: E0213 08:16:02.733272 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:16:02.733565 kubelet[2569]: E0213 08:16:02.733287 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 08:16:04.397775 sshd[16815]: Received disconnect from 43.153.220.201 port 45864:11: Bye Bye [preauth] Feb 13 08:16:04.397775 sshd[16815]: Disconnected from authenticating user root 43.153.220.201 port 45864 [preauth] Feb 13 08:16:04.400292 systemd[1]: sshd@111-145.40.90.207:22-43.153.220.201:45864.service: Deactivated successfully. 
Feb 13 08:16:04.400000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@111-145.40.90.207:22-43.153.220.201:45864 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:04.427804 kernel: kauditd_printk_skb: 3 callbacks suppressed Feb 13 08:16:04.427842 kernel: audit: type=1131 audit(1707812164.400:2294): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@111-145.40.90.207:22-43.153.220.201:45864 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:04.521997 systemd[1]: Started sshd@112-145.40.90.207:22-139.178.68.195:40518.service. Feb 13 08:16:04.521000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@112-145.40.90.207:22-139.178.68.195:40518 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:04.612695 kernel: audit: type=1130 audit(1707812164.521:2295): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@112-145.40.90.207:22-139.178.68.195:40518 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:04.639000 audit[16880]: USER_ACCT pid=16880 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:04.640587 sshd[16880]: Accepted publickey for core from 139.178.68.195 port 40518 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:16:04.641944 sshd[16880]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:16:04.644192 systemd-logind[1446]: New session 104 of user core. Feb 13 08:16:04.644708 systemd[1]: Started session-104.scope. Feb 13 08:16:04.723756 sshd[16880]: pam_unix(sshd:session): session closed for user core Feb 13 08:16:04.725298 systemd[1]: sshd@112-145.40.90.207:22-139.178.68.195:40518.service: Deactivated successfully. Feb 13 08:16:04.725790 systemd[1]: session-104.scope: Deactivated successfully. Feb 13 08:16:04.726188 systemd-logind[1446]: Session 104 logged out. Waiting for processes to exit. Feb 13 08:16:04.726573 systemd-logind[1446]: Removed session 104. 
Feb 13 08:16:04.641000 audit[16880]: CRED_ACQ pid=16880 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:04.823363 kernel: audit: type=1101 audit(1707812164.639:2296): pid=16880 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:04.823398 kernel: audit: type=1103 audit(1707812164.641:2297): pid=16880 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:04.823421 kernel: audit: type=1006 audit(1707812164.641:2298): pid=16880 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=104 res=1 Feb 13 08:16:04.882151 kernel: audit: type=1300 audit(1707812164.641:2298): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffdcd89bb70 a2=3 a3=0 items=0 ppid=1 pid=16880 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=104 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:16:04.641000 audit[16880]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffdcd89bb70 a2=3 a3=0 items=0 ppid=1 pid=16880 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=104 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:16:04.974303 kernel: audit: type=1327 audit(1707812164.641:2298): proctitle=737368643A20636F7265205B707269765D Feb 13 08:16:04.641000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:16:05.004827 kernel: audit: type=1105 audit(1707812164.646:2299): pid=16880 uid=0 auid=500 ses=104 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:04.646000 audit[16880]: USER_START pid=16880 uid=0 auid=500 ses=104 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:05.100318 kernel: audit: type=1103 audit(1707812164.647:2300): pid=16882 uid=0 auid=500 ses=104 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:04.647000 audit[16882]: CRED_ACQ pid=16882 uid=0 auid=500 ses=104 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:04.723000 audit[16880]: USER_END pid=16880 uid=0 auid=500 ses=104 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" 
exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:05.285384 kernel: audit: type=1106 audit(1707812164.723:2301): pid=16880 uid=0 auid=500 ses=104 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:04.724000 audit[16880]: CRED_DISP pid=16880 uid=0 auid=500 ses=104 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:04.724000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@112-145.40.90.207:22-139.178.68.195:40518 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:05.706405 env[1458]: time="2024-02-13T08:16:05.706212466Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 08:16:05.735438 env[1458]: time="2024-02-13T08:16:05.735398440Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:16:05.735613 kubelet[2569]: E0213 08:16:05.735603 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 08:16:05.735808 kubelet[2569]: E0213 08:16:05.735630 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 08:16:05.735808 kubelet[2569]: E0213 08:16:05.735694 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:16:05.735808 kubelet[2569]: E0213 08:16:05.735712 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 08:16:09.727697 systemd[1]: Started sshd@113-145.40.90.207:22-139.178.68.195:54018.service. Feb 13 08:16:09.727000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@113-145.40.90.207:22-139.178.68.195:54018 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:09.754814 kernel: kauditd_printk_skb: 2 callbacks suppressed Feb 13 08:16:09.754943 kernel: audit: type=1130 audit(1707812169.727:2304): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@113-145.40.90.207:22-139.178.68.195:54018 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:09.872000 audit[16934]: USER_ACCT pid=16934 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:09.873728 sshd[16934]: Accepted publickey for core from 139.178.68.195 port 54018 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:16:09.874935 sshd[16934]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:16:09.877098 systemd-logind[1446]: New session 105 of user core. Feb 13 08:16:09.877575 systemd[1]: Started session-105.scope. Feb 13 08:16:09.874000 audit[16934]: CRED_ACQ pid=16934 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:10.056896 kernel: audit: type=1101 audit(1707812169.872:2305): pid=16934 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:10.056940 kernel: audit: type=1103 audit(1707812169.874:2306): pid=16934 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:10.056965 kernel: audit: type=1006 audit(1707812169.874:2307): pid=16934 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=105 res=1 Feb 13 08:16:10.116030 kernel: audit: type=1300 audit(1707812169.874:2307): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd4d1c2960 a2=3 a3=0 items=0 ppid=1 pid=16934 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=105 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:16:09.874000 audit[16934]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd4d1c2960 a2=3 a3=0 items=0 ppid=1 pid=16934 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=105 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:16:10.208806 kernel: audit: type=1327 audit(1707812169.874:2307): proctitle=737368643A20636F7265205B707269765D Feb 13 08:16:09.874000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D 
Feb 13 08:16:10.209047 sshd[16934]: pam_unix(sshd:session): session closed for user core Feb 13 08:16:10.210431 systemd[1]: sshd@113-145.40.90.207:22-139.178.68.195:54018.service: Deactivated successfully. Feb 13 08:16:10.210856 systemd[1]: session-105.scope: Deactivated successfully. Feb 13 08:16:10.211175 systemd-logind[1446]: Session 105 logged out. Waiting for processes to exit. Feb 13 08:16:10.211577 systemd-logind[1446]: Removed session 105. Feb 13 08:16:10.239479 kernel: audit: type=1105 audit(1707812169.879:2308): pid=16934 uid=0 auid=500 ses=105 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:09.879000 audit[16934]: USER_START pid=16934 uid=0 auid=500 ses=105 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:10.334633 kernel: audit: type=1103 audit(1707812169.879:2309): pid=16936 uid=0 auid=500 ses=105 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:09.879000 audit[16936]: CRED_ACQ pid=16936 uid=0 auid=500 ses=105 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:10.424505 kernel: audit: type=1106 audit(1707812170.209:2310): pid=16934 uid=0 auid=500 ses=105 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:10.209000 audit[16934]: USER_END pid=16934 uid=0 auid=500 ses=105 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:10.520708 kernel: audit: type=1104 audit(1707812170.209:2311): pid=16934 uid=0 auid=500 ses=105 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:10.209000 audit[16934]: CRED_DISP pid=16934 uid=0 auid=500 ses=105 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:10.209000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@113-145.40.90.207:22-139.178.68.195:54018 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:16:13.706786 env[1458]: time="2024-02-13T08:16:13.706697428Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 08:16:13.707840 env[1458]: time="2024-02-13T08:16:13.706734423Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 08:16:13.726041 env[1458]: time="2024-02-13T08:16:13.725983604Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:16:13.726234 kubelet[2569]: E0213 08:16:13.726221 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 08:16:13.726430 kubelet[2569]: E0213 08:16:13.726256 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 08:16:13.726430 kubelet[2569]: E0213 08:16:13.726294 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:16:13.726430 kubelet[2569]: E0213 08:16:13.726327 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 08:16:13.726430 kubelet[2569]: E0213 08:16:13.726372 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 08:16:13.726430 kubelet[2569]: E0213 08:16:13.726391 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd 
ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 08:16:13.726586 env[1458]: time="2024-02-13T08:16:13.726282355Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:16:13.726614 kubelet[2569]: E0213 08:16:13.726411 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:16:13.726614 kubelet[2569]: E0213 08:16:13.726427 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 08:16:14.706072 env[1458]: time="2024-02-13T08:16:14.705972526Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 08:16:14.761074 env[1458]: time="2024-02-13T08:16:14.760971071Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:16:14.761526 kubelet[2569]: E0213 08:16:14.761286 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 08:16:14.761526 kubelet[2569]: E0213 08:16:14.761336 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 08:16:14.761526 kubelet[2569]: E0213 08:16:14.761395 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin 
type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:16:14.761526 kubelet[2569]: E0213 08:16:14.761450 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 08:16:15.161454 systemd[1]: Started sshd@114-145.40.90.207:22-139.178.68.195:54026.service. Feb 13 08:16:15.161000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@114-145.40.90.207:22-139.178.68.195:54026 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:15.188321 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:16:15.188368 kernel: audit: type=1130 audit(1707812175.161:2313): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@114-145.40.90.207:22-139.178.68.195:54026 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:15.304000 audit[17054]: USER_ACCT pid=17054 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:15.305063 sshd[17054]: Accepted publickey for core from 139.178.68.195 port 54026 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:16:15.306455 sshd[17054]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:16:15.308983 systemd-logind[1446]: New session 106 of user core. Feb 13 08:16:15.309418 systemd[1]: Started session-106.scope. Feb 13 08:16:15.388454 sshd[17054]: pam_unix(sshd:session): session closed for user core Feb 13 08:16:15.389922 systemd[1]: sshd@114-145.40.90.207:22-139.178.68.195:54026.service: Deactivated successfully. Feb 13 08:16:15.390342 systemd[1]: session-106.scope: Deactivated successfully. Feb 13 08:16:15.390622 systemd-logind[1446]: Session 106 logged out. Waiting for processes to exit. Feb 13 08:16:15.391126 systemd-logind[1446]: Removed session 106. 
Feb 13 08:16:15.305000 audit[17054]: CRED_ACQ pid=17054 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:15.487412 kernel: audit: type=1101 audit(1707812175.304:2314): pid=17054 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:15.487450 kernel: audit: type=1103 audit(1707812175.305:2315): pid=17054 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:15.487468 kernel: audit: type=1006 audit(1707812175.305:2316): pid=17054 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=106 res=1 Feb 13 08:16:15.546540 kernel: audit: type=1300 audit(1707812175.305:2316): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd3d567290 a2=3 a3=0 items=0 ppid=1 pid=17054 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=106 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:16:15.305000 audit[17054]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd3d567290 a2=3 a3=0 items=0 ppid=1 pid=17054 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=106 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:16:15.639308 kernel: audit: type=1327 audit(1707812175.305:2316): proctitle=737368643A20636F7265205B707269765D Feb 13 08:16:15.305000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:16:15.670016 kernel: audit: type=1105 audit(1707812175.310:2317): pid=17054 uid=0 auid=500 ses=106 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:15.310000 audit[17054]: USER_START pid=17054 uid=0 auid=500 ses=106 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:15.311000 audit[17056]: CRED_ACQ pid=17056 uid=0 auid=500 ses=106 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:15.854720 kernel: audit: type=1103 audit(1707812175.311:2318): pid=17056 uid=0 auid=500 ses=106 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:15.854804 kernel: audit: type=1106 audit(1707812175.388:2319): pid=17054 uid=0 auid=500 ses=106 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close 
grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:15.388000 audit[17054]: USER_END pid=17054 uid=0 auid=500 ses=106 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:15.950544 kernel: audit: type=1104 audit(1707812175.388:2320): pid=17054 uid=0 auid=500 ses=106 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:15.388000 audit[17054]: CRED_DISP pid=17054 uid=0 auid=500 ses=106 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:15.389000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@114-145.40.90.207:22-139.178.68.195:54026 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:17.163179 systemd[1]: Started sshd@115-145.40.90.207:22-202.188.109.48:51820.service. Feb 13 08:16:17.162000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@115-145.40.90.207:22-202.188.109.48:51820 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:18.327387 sshd[17082]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=202.188.109.48 user=root Feb 13 08:16:18.327000 audit[17082]: USER_AUTH pid=17082 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? 
acct="root" exe="/usr/sbin/sshd" hostname=202.188.109.48 addr=202.188.109.48 terminal=ssh res=failed' Feb 13 08:16:19.706735 env[1458]: time="2024-02-13T08:16:19.706573135Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 08:16:19.735903 env[1458]: time="2024-02-13T08:16:19.735799445Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:16:19.736508 kubelet[2569]: E0213 08:16:19.736275 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 08:16:19.736508 kubelet[2569]: E0213 08:16:19.736356 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 08:16:19.736508 kubelet[2569]: E0213 08:16:19.736404 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:16:19.736508 kubelet[2569]: E0213 08:16:19.736442 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 08:16:19.962865 sshd[17082]: Failed password for root from 202.188.109.48 port 51820 ssh2 Feb 13 08:16:20.254882 sshd[17082]: Received disconnect from 202.188.109.48 port 51820:11: Bye Bye [preauth] Feb 13 08:16:20.254882 sshd[17082]: Disconnected from authenticating user root 202.188.109.48 port 51820 [preauth] Feb 13 08:16:20.257529 systemd[1]: sshd@115-145.40.90.207:22-202.188.109.48:51820.service: Deactivated successfully. Feb 13 08:16:20.257000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@115-145.40.90.207:22-202.188.109.48:51820 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:16:20.297650 kernel: kauditd_printk_skb: 3 callbacks suppressed Feb 13 08:16:20.297732 kernel: audit: type=1131 audit(1707812180.257:2324): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@115-145.40.90.207:22-202.188.109.48:51820 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:20.391749 systemd[1]: Started sshd@116-145.40.90.207:22-139.178.68.195:48502.service. Feb 13 08:16:20.391000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@116-145.40.90.207:22-139.178.68.195:48502 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:20.480713 kernel: audit: type=1130 audit(1707812180.391:2325): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@116-145.40.90.207:22-139.178.68.195:48502 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:20.507000 audit[17116]: USER_ACCT pid=17116 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:20.508503 sshd[17116]: Accepted publickey for core from 139.178.68.195 port 48502 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:16:20.510909 sshd[17116]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:16:20.513338 systemd-logind[1446]: New session 107 of user core. Feb 13 08:16:20.514013 systemd[1]: Started session-107.scope. Feb 13 08:16:20.594006 sshd[17116]: pam_unix(sshd:session): session closed for user core Feb 13 08:16:20.595390 systemd[1]: sshd@116-145.40.90.207:22-139.178.68.195:48502.service: Deactivated successfully. Feb 13 08:16:20.595867 systemd[1]: session-107.scope: Deactivated successfully. Feb 13 08:16:20.596236 systemd-logind[1446]: Session 107 logged out. Waiting for processes to exit. Feb 13 08:16:20.596630 systemd-logind[1446]: Removed session 107. 
Feb 13 08:16:20.510000 audit[17116]: CRED_ACQ pid=17116 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:20.691502 kernel: audit: type=1101 audit(1707812180.507:2326): pid=17116 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:20.691594 kernel: audit: type=1103 audit(1707812180.510:2327): pid=17116 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:20.691612 kernel: audit: type=1006 audit(1707812180.510:2328): pid=17116 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=107 res=1 Feb 13 08:16:20.750423 kernel: audit: type=1300 audit(1707812180.510:2328): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff91467b10 a2=3 a3=0 items=0 ppid=1 pid=17116 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=107 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:16:20.510000 audit[17116]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff91467b10 a2=3 a3=0 items=0 ppid=1 pid=17116 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=107 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:16:20.842717 kernel: audit: type=1327 audit(1707812180.510:2328): proctitle=737368643A20636F7265205B707269765D Feb 13 08:16:20.510000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:16:20.873229 kernel: audit: type=1105 audit(1707812180.515:2329): pid=17116 uid=0 auid=500 ses=107 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:20.515000 audit[17116]: USER_START pid=17116 uid=0 auid=500 ses=107 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:20.968883 kernel: audit: type=1103 audit(1707812180.516:2330): pid=17118 uid=0 auid=500 ses=107 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:20.516000 audit[17118]: CRED_ACQ pid=17118 uid=0 auid=500 ses=107 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:20.594000 audit[17116]: USER_END pid=17116 uid=0 auid=500 ses=107 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" 
exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:21.154246 kernel: audit: type=1106 audit(1707812180.594:2331): pid=17116 uid=0 auid=500 ses=107 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:20.594000 audit[17116]: CRED_DISP pid=17116 uid=0 auid=500 ses=107 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:20.594000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@116-145.40.90.207:22-139.178.68.195:48502 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:25.607292 systemd[1]: Started sshd@117-145.40.90.207:22-139.178.68.195:48508.service. Feb 13 08:16:25.607000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@117-145.40.90.207:22-139.178.68.195:48508 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:25.635214 kernel: kauditd_printk_skb: 2 callbacks suppressed Feb 13 08:16:25.635271 kernel: audit: type=1130 audit(1707812185.607:2334): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@117-145.40.90.207:22-139.178.68.195:48508 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:25.753000 audit[17141]: USER_ACCT pid=17141 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:25.754657 sshd[17141]: Accepted publickey for core from 139.178.68.195 port 48508 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:16:25.755936 sshd[17141]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:16:25.758304 systemd-logind[1446]: New session 108 of user core. Feb 13 08:16:25.758765 systemd[1]: Started session-108.scope. Feb 13 08:16:25.837169 sshd[17141]: pam_unix(sshd:session): session closed for user core Feb 13 08:16:25.838558 systemd[1]: sshd@117-145.40.90.207:22-139.178.68.195:48508.service: Deactivated successfully. Feb 13 08:16:25.838980 systemd[1]: session-108.scope: Deactivated successfully. Feb 13 08:16:25.839335 systemd-logind[1446]: Session 108 logged out. Waiting for processes to exit. Feb 13 08:16:25.839940 systemd-logind[1446]: Removed session 108. 
Feb 13 08:16:25.755000 audit[17141]: CRED_ACQ pid=17141 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:25.937883 kernel: audit: type=1101 audit(1707812185.753:2335): pid=17141 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:25.937920 kernel: audit: type=1103 audit(1707812185.755:2336): pid=17141 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:25.937940 kernel: audit: type=1006 audit(1707812185.755:2337): pid=17141 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=108 res=1 Feb 13 08:16:25.997070 kernel: audit: type=1300 audit(1707812185.755:2337): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffec262c00 a2=3 a3=0 items=0 ppid=1 pid=17141 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=108 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:16:25.755000 audit[17141]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffec262c00 a2=3 a3=0 items=0 ppid=1 pid=17141 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=108 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:16:26.089841 kernel: audit: type=1327 audit(1707812185.755:2337): proctitle=737368643A20636F7265205B707269765D Feb 13 08:16:25.755000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:16:26.120535 kernel: audit: type=1105 audit(1707812185.760:2338): pid=17141 uid=0 auid=500 ses=108 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:25.760000 audit[17141]: USER_START pid=17141 uid=0 auid=500 ses=108 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:26.215838 kernel: audit: type=1103 audit(1707812185.760:2339): pid=17143 uid=0 auid=500 ses=108 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:25.760000 audit[17143]: CRED_ACQ pid=17143 uid=0 auid=500 ses=108 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:26.305768 kernel: audit: type=1106 audit(1707812185.837:2340): pid=17141 uid=0 auid=500 ses=108 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close 
grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:25.837000 audit[17141]: USER_END pid=17141 uid=0 auid=500 ses=108 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:26.402010 kernel: audit: type=1104 audit(1707812185.837:2341): pid=17141 uid=0 auid=500 ses=108 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:25.837000 audit[17141]: CRED_DISP pid=17141 uid=0 auid=500 ses=108 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:25.838000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@117-145.40.90.207:22-139.178.68.195:48508 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:26.705772 env[1458]: time="2024-02-13T08:16:26.705657668Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 08:16:26.722076 env[1458]: time="2024-02-13T08:16:26.722011903Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:16:26.722245 kubelet[2569]: E0213 08:16:26.722195 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 08:16:26.722245 kubelet[2569]: E0213 08:16:26.722221 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 08:16:26.722245 kubelet[2569]: E0213 08:16:26.722243 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:16:26.722480 kubelet[2569]: E0213 08:16:26.722262 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for 
\"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 08:16:27.706454 env[1458]: time="2024-02-13T08:16:27.706359150Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 08:16:27.755218 env[1458]: time="2024-02-13T08:16:27.755162197Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:16:27.755401 kubelet[2569]: E0213 08:16:27.755383 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 08:16:27.755723 kubelet[2569]: E0213 08:16:27.755421 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 08:16:27.755723 kubelet[2569]: E0213 08:16:27.755463 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:16:27.755723 kubelet[2569]: E0213 08:16:27.755496 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 08:16:28.706616 env[1458]: time="2024-02-13T08:16:28.706519786Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 08:16:28.733220 env[1458]: time="2024-02-13T08:16:28.733158217Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox 
\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:16:28.733489 kubelet[2569]: E0213 08:16:28.733453 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 08:16:28.733489 kubelet[2569]: E0213 08:16:28.733477 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 08:16:28.733555 kubelet[2569]: E0213 08:16:28.733498 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:16:28.733555 kubelet[2569]: E0213 08:16:28.733515 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 08:16:30.846671 systemd[1]: Started sshd@118-145.40.90.207:22-139.178.68.195:44086.service. Feb 13 08:16:30.846000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@118-145.40.90.207:22-139.178.68.195:44086 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:30.874088 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:16:30.874210 kernel: audit: type=1130 audit(1707812190.846:2343): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@118-145.40.90.207:22-139.178.68.195:44086 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:16:30.992885 sshd[17249]: Accepted publickey for core from 139.178.68.195 port 44086 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:16:30.992000 audit[17249]: USER_ACCT pid=17249 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:30.994935 sshd[17249]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:16:30.997132 systemd-logind[1446]: New session 109 of user core. Feb 13 08:16:30.997822 systemd[1]: Started session-109.scope. Feb 13 08:16:31.075405 sshd[17249]: pam_unix(sshd:session): session closed for user core Feb 13 08:16:31.076868 systemd[1]: sshd@118-145.40.90.207:22-139.178.68.195:44086.service: Deactivated successfully. Feb 13 08:16:31.077341 systemd[1]: session-109.scope: Deactivated successfully. Feb 13 08:16:31.077719 systemd-logind[1446]: Session 109 logged out. Waiting for processes to exit. Feb 13 08:16:31.078229 systemd-logind[1446]: Removed session 109. Feb 13 08:16:30.994000 audit[17249]: CRED_ACQ pid=17249 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:31.175395 kernel: audit: type=1101 audit(1707812190.992:2344): pid=17249 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:31.175440 kernel: audit: type=1103 audit(1707812190.994:2345): pid=17249 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:31.175460 kernel: audit: type=1006 audit(1707812190.994:2346): pid=17249 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=109 res=1 Feb 13 08:16:31.234570 kernel: audit: type=1300 audit(1707812190.994:2346): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe8ded7530 a2=3 a3=0 items=0 ppid=1 pid=17249 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=109 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:16:30.994000 audit[17249]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe8ded7530 a2=3 a3=0 items=0 ppid=1 pid=17249 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=109 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:16:30.994000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:16:31.358045 kernel: audit: type=1327 audit(1707812190.994:2346): proctitle=737368643A20636F7265205B707269765D Feb 13 08:16:31.358075 kernel: audit: type=1105 audit(1707812190.999:2347): pid=17249 uid=0 auid=500 ses=109 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 
08:16:30.999000 audit[17249]: USER_START pid=17249 uid=0 auid=500 ses=109 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:31.453315 kernel: audit: type=1103 audit(1707812191.000:2348): pid=17251 uid=0 auid=500 ses=109 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:31.000000 audit[17251]: CRED_ACQ pid=17251 uid=0 auid=500 ses=109 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:31.542829 kernel: audit: type=1106 audit(1707812191.075:2349): pid=17249 uid=0 auid=500 ses=109 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:31.075000 audit[17249]: USER_END pid=17249 uid=0 auid=500 ses=109 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:31.638557 kernel: audit: type=1104 audit(1707812191.075:2350): pid=17249 uid=0 auid=500 ses=109 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:31.075000 audit[17249]: CRED_DISP pid=17249 uid=0 auid=500 ses=109 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:31.076000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@118-145.40.90.207:22-139.178.68.195:44086 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:16:33.706278 env[1458]: time="2024-02-13T08:16:33.706114930Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 08:16:33.721710 env[1458]: time="2024-02-13T08:16:33.721644004Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:16:33.721879 kubelet[2569]: E0213 08:16:33.721839 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 08:16:33.721879 kubelet[2569]: E0213 08:16:33.721864 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 08:16:33.722083 kubelet[2569]: E0213 08:16:33.721888 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:16:33.722083 kubelet[2569]: E0213 08:16:33.721907 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 08:16:36.085512 systemd[1]: Started sshd@119-145.40.90.207:22-139.178.68.195:58406.service. Feb 13 08:16:36.085000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@119-145.40.90.207:22-139.178.68.195:58406 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:36.112833 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:16:36.112891 kernel: audit: type=1130 audit(1707812196.085:2352): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@119-145.40.90.207:22-139.178.68.195:58406 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:16:36.230000 audit[17307]: USER_ACCT pid=17307 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:36.231622 sshd[17307]: Accepted publickey for core from 139.178.68.195 port 58406 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:16:36.232764 sshd[17307]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:16:36.235070 systemd-logind[1446]: New session 110 of user core. Feb 13 08:16:36.235791 systemd[1]: Started session-110.scope. Feb 13 08:16:36.315421 sshd[17307]: pam_unix(sshd:session): session closed for user core Feb 13 08:16:36.316868 systemd[1]: sshd@119-145.40.90.207:22-139.178.68.195:58406.service: Deactivated successfully. Feb 13 08:16:36.317348 systemd[1]: session-110.scope: Deactivated successfully. Feb 13 08:16:36.317722 systemd-logind[1446]: Session 110 logged out. Waiting for processes to exit. Feb 13 08:16:36.318147 systemd-logind[1446]: Removed session 110. Feb 13 08:16:36.231000 audit[17307]: CRED_ACQ pid=17307 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:36.413713 kernel: audit: type=1101 audit(1707812196.230:2353): pid=17307 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:36.413759 kernel: audit: type=1103 audit(1707812196.231:2354): pid=17307 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:36.413781 kernel: audit: type=1006 audit(1707812196.231:2355): pid=17307 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=110 res=1 Feb 13 08:16:36.472553 kernel: audit: type=1300 audit(1707812196.231:2355): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe805b1a20 a2=3 a3=0 items=0 ppid=1 pid=17307 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=110 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:16:36.231000 audit[17307]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe805b1a20 a2=3 a3=0 items=0 ppid=1 pid=17307 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=110 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:16:36.564913 kernel: audit: type=1327 audit(1707812196.231:2355): proctitle=737368643A20636F7265205B707269765D Feb 13 08:16:36.231000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:16:36.595413 kernel: audit: type=1105 audit(1707812196.237:2356): pid=17307 uid=0 auid=500 ses=110 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 
08:16:36.237000 audit[17307]: USER_START pid=17307 uid=0 auid=500 ses=110 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:36.690081 kernel: audit: type=1103 audit(1707812196.237:2357): pid=17309 uid=0 auid=500 ses=110 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:36.237000 audit[17309]: CRED_ACQ pid=17309 uid=0 auid=500 ses=110 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:36.779489 kernel: audit: type=1106 audit(1707812196.315:2358): pid=17307 uid=0 auid=500 ses=110 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:36.315000 audit[17307]: USER_END pid=17307 uid=0 auid=500 ses=110 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:36.875196 kernel: audit: type=1104 audit(1707812196.315:2359): pid=17307 uid=0 auid=500 ses=110 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:36.315000 audit[17307]: CRED_DISP pid=17307 uid=0 auid=500 ses=110 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:36.316000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@119-145.40.90.207:22-139.178.68.195:58406 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:16:39.709620 env[1458]: time="2024-02-13T08:16:39.709571676Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 08:16:39.722602 env[1458]: time="2024-02-13T08:16:39.722538383Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:16:39.722829 kubelet[2569]: E0213 08:16:39.722778 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 08:16:39.722829 kubelet[2569]: E0213 08:16:39.722808 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 08:16:39.723110 kubelet[2569]: E0213 08:16:39.722834 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:16:39.723110 kubelet[2569]: E0213 08:16:39.722856 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 08:16:40.706808 env[1458]: time="2024-02-13T08:16:40.706655516Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 08:16:40.706808 env[1458]: time="2024-02-13T08:16:40.706763783Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 08:16:40.736576 env[1458]: time="2024-02-13T08:16:40.736541090Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:16:40.736932 kubelet[2569]: E0213 08:16:40.736783 2569 remote_runtime.go:205] 
"StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 08:16:40.736932 kubelet[2569]: E0213 08:16:40.736820 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 08:16:40.736932 kubelet[2569]: E0213 08:16:40.736846 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:16:40.736932 kubelet[2569]: E0213 08:16:40.736879 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 08:16:40.737189 kubelet[2569]: E0213 08:16:40.737027 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 08:16:40.737189 kubelet[2569]: E0213 08:16:40.737066 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 08:16:40.737189 kubelet[2569]: E0213 08:16:40.737096 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:16:40.737189 kubelet[2569]: E0213 08:16:40.737110 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox 
\\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 08:16:40.737297 env[1458]: time="2024-02-13T08:16:40.736928728Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:16:41.325866 systemd[1]: Started sshd@120-145.40.90.207:22-139.178.68.195:58410.service. Feb 13 08:16:41.325000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@120-145.40.90.207:22-139.178.68.195:58410 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:41.352671 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:16:41.352742 kernel: audit: type=1130 audit(1707812201.325:2361): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@120-145.40.90.207:22-139.178.68.195:58410 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:41.465981 systemd[1]: Started sshd@121-145.40.90.207:22-101.36.65.131:18640.service. Feb 13 08:16:41.465000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@121-145.40.90.207:22-101.36.65.131:18640 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:41.470543 sshd[17419]: Accepted publickey for core from 139.178.68.195 port 58410 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:16:41.471890 sshd[17419]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:16:41.474251 systemd-logind[1446]: New session 111 of user core. Feb 13 08:16:41.474985 systemd[1]: Started session-111.scope. Feb 13 08:16:41.469000 audit[17419]: USER_ACCT pid=17419 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:41.558352 sshd[17419]: pam_unix(sshd:session): session closed for user core Feb 13 08:16:41.559860 systemd[1]: sshd@120-145.40.90.207:22-139.178.68.195:58410.service: Deactivated successfully. Feb 13 08:16:41.560305 systemd[1]: session-111.scope: Deactivated successfully. Feb 13 08:16:41.560598 systemd-logind[1446]: Session 111 logged out. Waiting for processes to exit. Feb 13 08:16:41.561152 systemd-logind[1446]: Removed session 111. Feb 13 08:16:41.648230 kernel: audit: type=1130 audit(1707812201.465:2362): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@121-145.40.90.207:22-101.36.65.131:18640 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:16:41.648318 kernel: audit: type=1101 audit(1707812201.469:2363): pid=17419 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:41.648337 kernel: audit: type=1103 audit(1707812201.471:2364): pid=17419 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:41.471000 audit[17419]: CRED_ACQ pid=17419 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:41.738946 kernel: audit: type=1006 audit(1707812201.471:2365): pid=17419 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=111 res=1 Feb 13 08:16:41.471000 audit[17419]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff5b944710 a2=3 a3=0 items=0 ppid=1 pid=17419 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=111 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:16:41.889903 kernel: audit: type=1300 audit(1707812201.471:2365): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff5b944710 a2=3 a3=0 items=0 ppid=1 pid=17419 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=111 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:16:41.889938 kernel: audit: type=1327 audit(1707812201.471:2365): proctitle=737368643A20636F7265205B707269765D Feb 13 08:16:41.471000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:16:41.920407 kernel: audit: type=1105 audit(1707812201.476:2366): pid=17419 uid=0 auid=500 ses=111 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:41.476000 audit[17419]: USER_START pid=17419 uid=0 auid=500 ses=111 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:42.015925 kernel: audit: type=1103 audit(1707812201.477:2367): pid=17423 uid=0 auid=500 ses=111 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:41.477000 audit[17423]: CRED_ACQ pid=17423 uid=0 auid=500 ses=111 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:42.105309 kernel: audit: type=1106 audit(1707812201.558:2368): pid=17419 uid=0 auid=500 ses=111 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close 
grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:41.558000 audit[17419]: USER_END pid=17419 uid=0 auid=500 ses=111 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:41.558000 audit[17419]: CRED_DISP pid=17419 uid=0 auid=500 ses=111 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:41.559000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@120-145.40.90.207:22-139.178.68.195:58410 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:44.706335 env[1458]: time="2024-02-13T08:16:44.706234492Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 08:16:44.745644 env[1458]: time="2024-02-13T08:16:44.745575752Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:16:44.745873 kubelet[2569]: E0213 08:16:44.745823 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 08:16:44.745873 kubelet[2569]: E0213 08:16:44.745866 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 08:16:44.746278 kubelet[2569]: E0213 08:16:44.745910 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:16:44.746278 kubelet[2569]: E0213 08:16:44.745947 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that 
the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 08:16:46.125000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:16:46.125000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c000b22360 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:16:46.125000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:16:46.125000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:16:46.125000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c002720a50 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:16:46.125000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:16:46.194000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:16:46.194000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5c a1=c014df4520 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:16:46.194000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:16:46.194000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:16:46.194000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5c a1=c005cc6e70 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 
tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:16:46.194000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:16:46.194000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/apiserver.crt" dev="sdb9" ino=524817 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:16:46.194000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5c a1=c00a4ee570 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:16:46.194000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:16:46.567004 systemd[1]: Started sshd@122-145.40.90.207:22-139.178.68.195:46486.service. Feb 13 08:16:46.566000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@122-145.40.90.207:22-139.178.68.195:46486 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:46.594884 kernel: kauditd_printk_skb: 17 callbacks suppressed Feb 13 08:16:46.594961 kernel: audit: type=1130 audit(1707812206.566:2376): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@122-145.40.90.207:22-139.178.68.195:46486 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:46.712000 audit[17475]: USER_ACCT pid=17475 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:46.712841 sshd[17475]: Accepted publickey for core from 139.178.68.195 port 46486 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:16:46.714945 sshd[17475]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:16:46.717358 systemd-logind[1446]: New session 112 of user core. Feb 13 08:16:46.717774 systemd[1]: Started session-112.scope. Feb 13 08:16:46.795791 sshd[17475]: pam_unix(sshd:session): session closed for user core Feb 13 08:16:46.797232 systemd[1]: sshd@122-145.40.90.207:22-139.178.68.195:46486.service: Deactivated successfully. Feb 13 08:16:46.797670 systemd[1]: session-112.scope: Deactivated successfully. Feb 13 08:16:46.798017 systemd-logind[1446]: Session 112 logged out. Waiting for processes to exit. Feb 13 08:16:46.798488 systemd-logind[1446]: Removed session 112. 
Feb 13 08:16:46.714000 audit[17475]: CRED_ACQ pid=17475 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:46.896057 kernel: audit: type=1101 audit(1707812206.712:2377): pid=17475 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:46.896096 kernel: audit: type=1103 audit(1707812206.714:2378): pid=17475 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:46.896113 kernel: audit: type=1006 audit(1707812206.714:2379): pid=17475 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=112 res=1 Feb 13 08:16:46.714000 audit[17475]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffcb96a8c10 a2=3 a3=0 items=0 ppid=1 pid=17475 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=112 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:16:47.047999 kernel: audit: type=1300 audit(1707812206.714:2379): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffcb96a8c10 a2=3 a3=0 items=0 ppid=1 pid=17475 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=112 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:16:47.048101 kernel: audit: type=1327 audit(1707812206.714:2379): proctitle=737368643A20636F7265205B707269765D Feb 13 08:16:46.714000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:16:47.078697 kernel: audit: type=1105 audit(1707812206.719:2380): pid=17475 uid=0 auid=500 ses=112 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:46.719000 audit[17475]: USER_START pid=17475 uid=0 auid=500 ses=112 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:47.173972 kernel: audit: type=1103 audit(1707812206.719:2381): pid=17477 uid=0 auid=500 ses=112 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:46.719000 audit[17477]: CRED_ACQ pid=17477 uid=0 auid=500 ses=112 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:47.264030 kernel: audit: type=1106 audit(1707812206.795:2382): pid=17475 uid=0 auid=500 ses=112 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close 
grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:46.795000 audit[17475]: USER_END pid=17475 uid=0 auid=500 ses=112 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:46.795000 audit[17475]: CRED_DISP pid=17475 uid=0 auid=500 ses=112 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:47.450455 kernel: audit: type=1104 audit(1707812206.795:2383): pid=17475 uid=0 auid=500 ses=112 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:46.796000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@122-145.40.90.207:22-139.178.68.195:46486 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:46.947000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:16:46.947000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5c a1=c00e509520 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:16:46.947000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:16:46.947000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=524821 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:16:46.947000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5c a1=c008db70e0 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:16:46.947000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:16:46.947000 audit[2394]: AVC avc: denied { watch } for pid=2394 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-client.crt" dev="sdb9" ino=524823 scontext=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 
tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:16:46.947000 audit[2394]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5d a1=c00812c990 a2=fc6 a3=0 items=0 ppid=2256 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c263,c264 key=(null) Feb 13 08:16:46.947000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E39302E323037002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 08:16:50.880000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:16:50.880000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c000db5b60 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:16:50.880000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:16:50.880000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:16:50.880000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c000c246e0 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:16:50.880000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:16:50.880000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:16:50.880000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c0015a5040 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:16:50.880000 audit: PROCTITLE 
proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:16:50.885000 audit[2387]: AVC avc: denied { watch } for pid=2387 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:16:50.885000 audit[2387]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c0010a0880 a2=fc6 a3=0 items=0 ppid=2242 pid=2387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c631,c817 key=(null) Feb 13 08:16:50.885000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:16:51.706035 env[1458]: time="2024-02-13T08:16:51.705915605Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 08:16:51.720809 env[1458]: time="2024-02-13T08:16:51.720729806Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:16:51.721616 kubelet[2569]: E0213 08:16:51.721598 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 08:16:51.721833 kubelet[2569]: E0213 08:16:51.721640 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 08:16:51.721833 kubelet[2569]: E0213 08:16:51.721676 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:16:51.721833 kubelet[2569]: E0213 08:16:51.721696 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox 
\\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 08:16:51.808034 systemd[1]: Started sshd@123-145.40.90.207:22-139.178.68.195:46502.service. Feb 13 08:16:51.808000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@123-145.40.90.207:22-139.178.68.195:46502 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:51.836037 kernel: kauditd_printk_skb: 22 callbacks suppressed Feb 13 08:16:51.836098 kernel: audit: type=1130 audit(1707812211.808:2392): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@123-145.40.90.207:22-139.178.68.195:46502 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:51.954000 audit[17530]: USER_ACCT pid=17530 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:51.955607 sshd[17530]: Accepted publickey for core from 139.178.68.195 port 46502 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:16:51.957926 sshd[17530]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:16:51.960316 systemd-logind[1446]: New session 113 of user core. Feb 13 08:16:51.960804 systemd[1]: Started session-113.scope. Feb 13 08:16:52.039171 sshd[17530]: pam_unix(sshd:session): session closed for user core Feb 13 08:16:52.040507 systemd[1]: sshd@123-145.40.90.207:22-139.178.68.195:46502.service: Deactivated successfully. Feb 13 08:16:52.040946 systemd[1]: session-113.scope: Deactivated successfully. Feb 13 08:16:52.041308 systemd-logind[1446]: Session 113 logged out. Waiting for processes to exit. Feb 13 08:16:52.041809 systemd-logind[1446]: Removed session 113. 
Feb 13 08:16:51.957000 audit[17530]: CRED_ACQ pid=17530 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:52.138200 kernel: audit: type=1101 audit(1707812211.954:2393): pid=17530 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:52.138246 kernel: audit: type=1103 audit(1707812211.957:2394): pid=17530 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:52.138266 kernel: audit: type=1006 audit(1707812211.957:2395): pid=17530 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=113 res=1 Feb 13 08:16:52.197422 kernel: audit: type=1300 audit(1707812211.957:2395): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff7c8b1700 a2=3 a3=0 items=0 ppid=1 pid=17530 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=113 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:16:51.957000 audit[17530]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff7c8b1700 a2=3 a3=0 items=0 ppid=1 pid=17530 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=113 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:16:52.290197 kernel: audit: type=1327 audit(1707812211.957:2395): proctitle=737368643A20636F7265205B707269765D Feb 13 08:16:51.957000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:16:52.320890 kernel: audit: type=1105 audit(1707812211.962:2396): pid=17530 uid=0 auid=500 ses=113 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:51.962000 audit[17530]: USER_START pid=17530 uid=0 auid=500 ses=113 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:52.416142 kernel: audit: type=1103 audit(1707812211.962:2397): pid=17532 uid=0 auid=500 ses=113 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:51.962000 audit[17532]: CRED_ACQ pid=17532 uid=0 auid=500 ses=113 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:52.505532 kernel: audit: type=1106 audit(1707812212.039:2398): pid=17530 uid=0 auid=500 ses=113 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close 
grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:52.039000 audit[17530]: USER_END pid=17530 uid=0 auid=500 ses=113 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:52.601316 kernel: audit: type=1104 audit(1707812212.039:2399): pid=17530 uid=0 auid=500 ses=113 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:52.039000 audit[17530]: CRED_DISP pid=17530 uid=0 auid=500 ses=113 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:52.039000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@123-145.40.90.207:22-139.178.68.195:46502 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:53.707053 env[1458]: time="2024-02-13T08:16:53.706906888Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 08:16:53.722057 env[1458]: time="2024-02-13T08:16:53.721993200Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:16:53.722209 kubelet[2569]: E0213 08:16:53.722195 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 08:16:53.722413 kubelet[2569]: E0213 08:16:53.722227 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 08:16:53.722413 kubelet[2569]: E0213 08:16:53.722263 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:16:53.722413 kubelet[2569]: E0213 08:16:53.722291 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for 
\"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 08:16:54.706203 env[1458]: time="2024-02-13T08:16:54.706099961Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 08:16:54.735702 env[1458]: time="2024-02-13T08:16:54.735642029Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:16:54.736030 kubelet[2569]: E0213 08:16:54.735945 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 08:16:54.736030 kubelet[2569]: E0213 08:16:54.735971 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 08:16:54.736030 kubelet[2569]: E0213 08:16:54.735991 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:16:54.736030 kubelet[2569]: E0213 08:16:54.736011 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 08:16:55.706167 env[1458]: time="2024-02-13T08:16:55.706033612Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 08:16:55.720840 env[1458]: time="2024-02-13T08:16:55.720802217Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox 
\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:16:55.721020 kubelet[2569]: E0213 08:16:55.720976 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 08:16:55.721020 kubelet[2569]: E0213 08:16:55.721002 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 08:16:55.721100 kubelet[2569]: E0213 08:16:55.721025 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:16:55.721100 kubelet[2569]: E0213 08:16:55.721044 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 08:16:57.048161 systemd[1]: Started sshd@124-145.40.90.207:22-139.178.68.195:34088.service. Feb 13 08:16:57.047000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@124-145.40.90.207:22-139.178.68.195:34088 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:57.075349 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:16:57.075427 kernel: audit: type=1130 audit(1707812217.047:2401): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@124-145.40.90.207:22-139.178.68.195:34088 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:16:57.193000 audit[17640]: USER_ACCT pid=17640 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:57.193892 sshd[17640]: Accepted publickey for core from 139.178.68.195 port 34088 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:16:57.195931 sshd[17640]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:16:57.198332 systemd-logind[1446]: New session 114 of user core. Feb 13 08:16:57.198860 systemd[1]: Started session-114.scope. Feb 13 08:16:57.275819 sshd[17640]: pam_unix(sshd:session): session closed for user core Feb 13 08:16:57.277218 systemd[1]: sshd@124-145.40.90.207:22-139.178.68.195:34088.service: Deactivated successfully. Feb 13 08:16:57.277668 systemd[1]: session-114.scope: Deactivated successfully. Feb 13 08:16:57.278040 systemd-logind[1446]: Session 114 logged out. Waiting for processes to exit. Feb 13 08:16:57.278469 systemd-logind[1446]: Removed session 114. Feb 13 08:16:57.195000 audit[17640]: CRED_ACQ pid=17640 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:57.376839 kernel: audit: type=1101 audit(1707812217.193:2402): pid=17640 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:57.376879 kernel: audit: type=1103 audit(1707812217.195:2403): pid=17640 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:57.376898 kernel: audit: type=1006 audit(1707812217.195:2404): pid=17640 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=114 res=1 Feb 13 08:16:57.435628 kernel: audit: type=1300 audit(1707812217.195:2404): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffdae33c810 a2=3 a3=0 items=0 ppid=1 pid=17640 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=114 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:16:57.195000 audit[17640]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffdae33c810 a2=3 a3=0 items=0 ppid=1 pid=17640 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=114 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:16:57.527785 kernel: audit: type=1327 audit(1707812217.195:2404): proctitle=737368643A20636F7265205B707269765D Feb 13 08:16:57.195000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:16:57.558256 kernel: audit: type=1105 audit(1707812217.200:2405): pid=17640 uid=0 auid=500 ses=114 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 
08:16:57.200000 audit[17640]: USER_START pid=17640 uid=0 auid=500 ses=114 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:57.652867 kernel: audit: type=1103 audit(1707812217.200:2406): pid=17642 uid=0 auid=500 ses=114 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:57.200000 audit[17642]: CRED_ACQ pid=17642 uid=0 auid=500 ses=114 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:57.742205 kernel: audit: type=1106 audit(1707812217.275:2407): pid=17640 uid=0 auid=500 ses=114 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:57.275000 audit[17640]: USER_END pid=17640 uid=0 auid=500 ses=114 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:57.837795 kernel: audit: type=1104 audit(1707812217.275:2408): pid=17640 uid=0 auid=500 ses=114 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:57.275000 audit[17640]: CRED_DISP pid=17640 uid=0 auid=500 ses=114 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:16:57.276000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@124-145.40.90.207:22-139.178.68.195:34088 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:01.165429 systemd[1]: Started sshd@125-145.40.90.207:22-113.31.105.94:41508.service. Feb 13 08:17:01.164000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@125-145.40.90.207:22-113.31.105.94:41508 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:02.285572 systemd[1]: Started sshd@126-145.40.90.207:22-139.178.68.195:34092.service. Feb 13 08:17:02.285000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@126-145.40.90.207:22-139.178.68.195:34092 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:17:02.312618 kernel: kauditd_printk_skb: 2 callbacks suppressed Feb 13 08:17:02.312692 kernel: audit: type=1130 audit(1707812222.285:2411): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@126-145.40.90.207:22-139.178.68.195:34092 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:02.430000 audit[17668]: USER_ACCT pid=17668 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:02.430757 sshd[17668]: Accepted publickey for core from 139.178.68.195 port 34092 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:17:02.431851 sshd[17668]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:17:02.434149 systemd-logind[1446]: New session 115 of user core. Feb 13 08:17:02.434825 systemd[1]: Started session-115.scope. Feb 13 08:17:02.514008 sshd[17668]: pam_unix(sshd:session): session closed for user core Feb 13 08:17:02.515244 systemd[1]: sshd@126-145.40.90.207:22-139.178.68.195:34092.service: Deactivated successfully. Feb 13 08:17:02.515753 systemd[1]: session-115.scope: Deactivated successfully. Feb 13 08:17:02.516114 systemd-logind[1446]: Session 115 logged out. Waiting for processes to exit. Feb 13 08:17:02.516516 systemd-logind[1446]: Removed session 115. Feb 13 08:17:02.431000 audit[17668]: CRED_ACQ pid=17668 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:02.525711 kernel: audit: type=1101 audit(1707812222.430:2412): pid=17668 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:02.525785 kernel: audit: type=1103 audit(1707812222.431:2413): pid=17668 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:02.674226 kernel: audit: type=1006 audit(1707812222.431:2414): pid=17668 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=115 res=1 Feb 13 08:17:02.674260 kernel: audit: type=1300 audit(1707812222.431:2414): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffce85e5d30 a2=3 a3=0 items=0 ppid=1 pid=17668 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=115 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:02.431000 audit[17668]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffce85e5d30 a2=3 a3=0 items=0 ppid=1 pid=17668 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=115 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:02.681965 sshd[17665]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=113.31.105.94 user=root Feb 13 08:17:02.766471 kernel: audit: type=1327 
audit(1707812222.431:2414): proctitle=737368643A20636F7265205B707269765D Feb 13 08:17:02.431000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:17:02.797025 kernel: audit: type=1105 audit(1707812222.436:2415): pid=17668 uid=0 auid=500 ses=115 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:02.436000 audit[17668]: USER_START pid=17668 uid=0 auid=500 ses=115 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:02.891712 kernel: audit: type=1103 audit(1707812222.436:2416): pid=17670 uid=0 auid=500 ses=115 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:02.436000 audit[17670]: CRED_ACQ pid=17670 uid=0 auid=500 ses=115 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:02.513000 audit[17668]: USER_END pid=17668 uid=0 auid=500 ses=115 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:03.076883 kernel: audit: type=1106 audit(1707812222.513:2417): pid=17668 uid=0 auid=500 ses=115 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:03.076918 kernel: audit: type=1104 audit(1707812222.513:2418): pid=17668 uid=0 auid=500 ses=115 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:02.513000 audit[17668]: CRED_DISP pid=17668 uid=0 auid=500 ses=115 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:02.514000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@126-145.40.90.207:22-139.178.68.195:34092 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:02.681000 audit[17665]: USER_AUTH pid=17665 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? acct="root" exe="/usr/sbin/sshd" hostname=113.31.105.94 addr=113.31.105.94 terminal=ssh res=failed' Feb 13 08:17:03.538334 systemd[1]: Started sshd@127-145.40.90.207:22-43.153.220.201:36614.service. 
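Every StopPodSandbox failure in this log is the same fault repeated: on the CNI delete path the plugin stats /var/lib/calico/nodename before tearing down the sandbox network, and the file is missing because the calico/node container (which writes it and mounts /var/lib/calico/) is not running. What follows is a stand-in sketch of that guard, assuming nothing about the real Calico source beyond what the logged error text itself states:

package main

import (
	"fmt"
	"os"
)

// nodenameFile is written by the calico/node container at startup; the
// CNI plugin reads it to learn which node it is acting for.
const nodenameFile = "/var/lib/calico/nodename"

// ensureNodename mimics the guard implied by the recurring error above:
// if the file is missing, teardown is refused with the hint seen in the log.
func ensureNodename() error {
	if _, err := os.Stat(nodenameFile); err != nil {
		return fmt.Errorf("%w: check that the calico/node container is "+
			"running and has mounted /var/lib/calico/", err)
	}
	return nil
}

func main() {
	if err := ensureNodename(); err != nil {
		fmt.Println("failed to destroy network for sandbox:", err)
	}
}

With the file absent, os.Stat yields "stat /var/lib/calico/nodename: no such file or directory", which wrapped with the hint reproduces the logged message verbatim. Until the file reappears, the kubelet keeps retrying the same four sandboxes (csi-node-driver-8djc9, coredns-5d78c9869d-qrnjl, coredns-5d78c9869d-7xbl5, calico-kube-controllers-846b88998b-4vbpv) on each pod-worker sync, which is why the identical error block recurs throughout the section.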
Feb 13 08:17:03.537000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@127-145.40.90.207:22-43.153.220.201:36614 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:04.543099 sshd[17691]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=43.153.220.201 user=root Feb 13 08:17:04.542000 audit[17691]: USER_AUTH pid=17691 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? acct="root" exe="/usr/sbin/sshd" hostname=43.153.220.201 addr=43.153.220.201 terminal=ssh res=failed' Feb 13 08:17:04.693165 sshd[17665]: Failed password for root from 113.31.105.94 port 41508 ssh2 Feb 13 08:17:06.260154 sshd[17665]: Received disconnect from 113.31.105.94 port 41508:11: Bye Bye [preauth] Feb 13 08:17:06.260154 sshd[17665]: Disconnected from authenticating user root 113.31.105.94 port 41508 [preauth] Feb 13 08:17:06.261209 systemd[1]: sshd@125-145.40.90.207:22-113.31.105.94:41508.service: Deactivated successfully. Feb 13 08:17:06.260000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@125-145.40.90.207:22-113.31.105.94:41508 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:06.706461 env[1458]: time="2024-02-13T08:17:06.706330935Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 08:17:06.706461 env[1458]: time="2024-02-13T08:17:06.706385538Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 08:17:06.734707 env[1458]: time="2024-02-13T08:17:06.734619840Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:17:06.734914 kubelet[2569]: E0213 08:17:06.734888 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 08:17:06.735185 kubelet[2569]: E0213 08:17:06.734919 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 08:17:06.735185 kubelet[2569]: E0213 08:17:06.734943 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 
08:17:06.735185 kubelet[2569]: E0213 08:17:06.734962 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 08:17:06.735185 kubelet[2569]: E0213 08:17:06.735182 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 08:17:06.735339 env[1458]: time="2024-02-13T08:17:06.735034546Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:17:06.735363 kubelet[2569]: E0213 08:17:06.735212 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 08:17:06.735363 kubelet[2569]: E0213 08:17:06.735239 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:17:06.735363 kubelet[2569]: E0213 08:17:06.735253 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 08:17:06.830309 sshd[17691]: Failed password for root from 43.153.220.201 port 36614 ssh2 Feb 13 08:17:07.523865 systemd[1]: Started sshd@128-145.40.90.207:22-139.178.68.195:53012.service. Feb 13 08:17:07.523000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@128-145.40.90.207:22-139.178.68.195:53012 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:17:07.550806 kernel: kauditd_printk_skb: 5 callbacks suppressed Feb 13 08:17:07.550892 kernel: audit: type=1130 audit(1707812227.523:2424): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@128-145.40.90.207:22-139.178.68.195:53012 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:07.667000 audit[17758]: USER_ACCT pid=17758 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:07.667815 sshd[17758]: Accepted publickey for core from 139.178.68.195 port 53012 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:17:07.670899 sshd[17758]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:17:07.673302 systemd-logind[1446]: New session 116 of user core. Feb 13 08:17:07.673809 systemd[1]: Started session-116.scope. Feb 13 08:17:07.706213 env[1458]: time="2024-02-13T08:17:07.706119029Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 08:17:07.721395 env[1458]: time="2024-02-13T08:17:07.721320707Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:17:07.721620 kubelet[2569]: E0213 08:17:07.721539 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 08:17:07.721620 kubelet[2569]: E0213 08:17:07.721571 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 08:17:07.721620 kubelet[2569]: E0213 08:17:07.721602 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:17:07.721749 kubelet[2569]: E0213 08:17:07.721625 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that 
the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 08:17:07.754374 sshd[17758]: pam_unix(sshd:session): session closed for user core Feb 13 08:17:07.755855 systemd[1]: sshd@128-145.40.90.207:22-139.178.68.195:53012.service: Deactivated successfully. Feb 13 08:17:07.756274 systemd[1]: session-116.scope: Deactivated successfully. Feb 13 08:17:07.756550 systemd-logind[1446]: Session 116 logged out. Waiting for processes to exit. Feb 13 08:17:07.757170 systemd-logind[1446]: Removed session 116. Feb 13 08:17:07.670000 audit[17758]: CRED_ACQ pid=17758 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:07.852254 kernel: audit: type=1101 audit(1707812227.667:2425): pid=17758 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:07.852301 kernel: audit: type=1103 audit(1707812227.670:2426): pid=17758 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:07.852323 kernel: audit: type=1006 audit(1707812227.670:2427): pid=17758 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=116 res=1 Feb 13 08:17:07.911057 kernel: audit: type=1300 audit(1707812227.670:2427): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff642dd9e0 a2=3 a3=0 items=0 ppid=1 pid=17758 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=116 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:07.670000 audit[17758]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff642dd9e0 a2=3 a3=0 items=0 ppid=1 pid=17758 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=116 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:08.003280 kernel: audit: type=1327 audit(1707812227.670:2427): proctitle=737368643A20636F7265205B707269765D Feb 13 08:17:07.670000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:17:08.033784 kernel: audit: type=1105 audit(1707812227.675:2428): pid=17758 uid=0 auid=500 ses=116 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:07.675000 audit[17758]: USER_START pid=17758 uid=0 auid=500 ses=116 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:08.128422 kernel: audit: type=1103 audit(1707812227.676:2429): pid=17760 uid=0 auid=500 ses=116 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix 
acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:07.676000 audit[17760]: CRED_ACQ pid=17760 uid=0 auid=500 ses=116 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:08.160860 sshd[17691]: Received disconnect from 43.153.220.201 port 36614:11: Bye Bye [preauth] Feb 13 08:17:08.160860 sshd[17691]: Disconnected from authenticating user root 43.153.220.201 port 36614 [preauth] Feb 13 08:17:08.161449 systemd[1]: sshd@127-145.40.90.207:22-43.153.220.201:36614.service: Deactivated successfully. Feb 13 08:17:08.217842 kernel: audit: type=1106 audit(1707812227.754:2430): pid=17758 uid=0 auid=500 ses=116 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:07.754000 audit[17758]: USER_END pid=17758 uid=0 auid=500 ses=116 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:08.313496 kernel: audit: type=1104 audit(1707812227.754:2431): pid=17758 uid=0 auid=500 ses=116 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:07.754000 audit[17758]: CRED_DISP pid=17758 uid=0 auid=500 ses=116 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:07.755000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@128-145.40.90.207:22-139.178.68.195:53012 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:08.160000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@127-145.40.90.207:22-43.153.220.201:36614 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:17:08.706662 env[1458]: time="2024-02-13T08:17:08.706496108Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 08:17:08.758777 env[1458]: time="2024-02-13T08:17:08.758691722Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:17:08.759181 kubelet[2569]: E0213 08:17:08.758968 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 08:17:08.759181 kubelet[2569]: E0213 08:17:08.759012 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 08:17:08.759181 kubelet[2569]: E0213 08:17:08.759058 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:17:08.759181 kubelet[2569]: E0213 08:17:08.759094 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 08:17:12.763368 systemd[1]: Started sshd@129-145.40.90.207:22-139.178.68.195:53014.service. Feb 13 08:17:12.762000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@129-145.40.90.207:22-139.178.68.195:53014 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:12.790325 kernel: kauditd_printk_skb: 2 callbacks suppressed Feb 13 08:17:12.790429 kernel: audit: type=1130 audit(1707812232.762:2434): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@129-145.40.90.207:22-139.178.68.195:53014 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:17:12.908000 audit[17843]: USER_ACCT pid=17843 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:12.908949 sshd[17843]: Accepted publickey for core from 139.178.68.195 port 53014 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:17:12.910912 sshd[17843]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:17:12.913292 systemd-logind[1446]: New session 117 of user core. Feb 13 08:17:12.913775 systemd[1]: Started session-117.scope. Feb 13 08:17:12.992726 sshd[17843]: pam_unix(sshd:session): session closed for user core Feb 13 08:17:12.994177 systemd[1]: sshd@129-145.40.90.207:22-139.178.68.195:53014.service: Deactivated successfully. Feb 13 08:17:12.994611 systemd[1]: session-117.scope: Deactivated successfully. Feb 13 08:17:12.995007 systemd-logind[1446]: Session 117 logged out. Waiting for processes to exit. Feb 13 08:17:12.995480 systemd-logind[1446]: Removed session 117. Feb 13 08:17:12.910000 audit[17843]: CRED_ACQ pid=17843 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:13.091398 kernel: audit: type=1101 audit(1707812232.908:2435): pid=17843 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:13.091444 kernel: audit: type=1103 audit(1707812232.910:2436): pid=17843 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:13.091464 kernel: audit: type=1006 audit(1707812232.910:2437): pid=17843 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=117 res=1 Feb 13 08:17:13.150226 kernel: audit: type=1300 audit(1707812232.910:2437): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd5ea23ed0 a2=3 a3=0 items=0 ppid=1 pid=17843 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=117 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:12.910000 audit[17843]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd5ea23ed0 a2=3 a3=0 items=0 ppid=1 pid=17843 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=117 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:13.242452 kernel: audit: type=1327 audit(1707812232.910:2437): proctitle=737368643A20636F7265205B707269765D Feb 13 08:17:12.910000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:17:13.272954 kernel: audit: type=1105 audit(1707812232.915:2438): pid=17843 uid=0 auid=500 ses=117 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 
08:17:12.915000 audit[17843]: USER_START pid=17843 uid=0 auid=500 ses=117 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:13.367626 kernel: audit: type=1103 audit(1707812232.915:2439): pid=17845 uid=0 auid=500 ses=117 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:12.915000 audit[17845]: CRED_ACQ pid=17845 uid=0 auid=500 ses=117 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:13.457030 kernel: audit: type=1106 audit(1707812232.992:2440): pid=17843 uid=0 auid=500 ses=117 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:12.992000 audit[17843]: USER_END pid=17843 uid=0 auid=500 ses=117 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:13.552712 kernel: audit: type=1104 audit(1707812232.992:2441): pid=17843 uid=0 auid=500 ses=117 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:12.992000 audit[17843]: CRED_DISP pid=17843 uid=0 auid=500 ses=117 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:12.993000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@129-145.40.90.207:22-139.178.68.195:53014 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:18.002409 systemd[1]: Started sshd@130-145.40.90.207:22-139.178.68.195:52232.service. Feb 13 08:17:18.001000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@130-145.40.90.207:22-139.178.68.195:52232 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:18.029170 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:17:18.029270 kernel: audit: type=1130 audit(1707812238.001:2443): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@130-145.40.90.207:22-139.178.68.195:52232 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:17:18.146000 audit[17867]: USER_ACCT pid=17867 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:18.147100 sshd[17867]: Accepted publickey for core from 139.178.68.195 port 52232 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:17:18.148843 sshd[17867]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:17:18.151818 systemd-logind[1446]: New session 118 of user core. Feb 13 08:17:18.152534 systemd[1]: Started session-118.scope. Feb 13 08:17:18.232344 sshd[17867]: pam_unix(sshd:session): session closed for user core Feb 13 08:17:18.233744 systemd[1]: sshd@130-145.40.90.207:22-139.178.68.195:52232.service: Deactivated successfully. Feb 13 08:17:18.234209 systemd[1]: session-118.scope: Deactivated successfully. Feb 13 08:17:18.234565 systemd-logind[1446]: Session 118 logged out. Waiting for processes to exit. Feb 13 08:17:18.235232 systemd-logind[1446]: Removed session 118. Feb 13 08:17:18.147000 audit[17867]: CRED_ACQ pid=17867 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:18.331168 kernel: audit: type=1101 audit(1707812238.146:2444): pid=17867 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:18.331209 kernel: audit: type=1103 audit(1707812238.147:2445): pid=17867 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:18.331231 kernel: audit: type=1006 audit(1707812238.147:2446): pid=17867 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=118 res=1 Feb 13 08:17:18.390017 kernel: audit: type=1300 audit(1707812238.147:2446): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe2966b190 a2=3 a3=0 items=0 ppid=1 pid=17867 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=118 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:18.147000 audit[17867]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe2966b190 a2=3 a3=0 items=0 ppid=1 pid=17867 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=118 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:18.482311 kernel: audit: type=1327 audit(1707812238.147:2446): proctitle=737368643A20636F7265205B707269765D Feb 13 08:17:18.147000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:17:18.512838 kernel: audit: type=1105 audit(1707812238.154:2447): pid=17867 uid=0 auid=500 ses=118 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 
08:17:18.154000 audit[17867]: USER_START pid=17867 uid=0 auid=500 ses=118 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:18.607607 kernel: audit: type=1103 audit(1707812238.155:2448): pid=17869 uid=0 auid=500 ses=118 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:18.155000 audit[17869]: CRED_ACQ pid=17869 uid=0 auid=500 ses=118 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:18.663737 systemd[1]: Started sshd@131-145.40.90.207:22-202.188.109.48:40088.service. Feb 13 08:17:18.232000 audit[17867]: USER_END pid=17867 uid=0 auid=500 ses=118 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:18.705085 env[1458]: time="2024-02-13T08:17:18.705068768Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 08:17:18.705085 env[1458]: time="2024-02-13T08:17:18.705066713Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 08:17:18.717515 env[1458]: time="2024-02-13T08:17:18.717444800Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:17:18.717515 env[1458]: time="2024-02-13T08:17:18.717500693Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:17:18.717687 kubelet[2569]: E0213 08:17:18.717640 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 08:17:18.717687 kubelet[2569]: E0213 08:17:18.717672 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 08:17:18.717875 kubelet[2569]: E0213 08:17:18.717694 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult 
failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:17:18.717875 kubelet[2569]: E0213 08:17:18.717640 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 08:17:18.717875 kubelet[2569]: E0213 08:17:18.717713 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 08:17:18.717875 kubelet[2569]: E0213 08:17:18.717725 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 08:17:18.717996 kubelet[2569]: E0213 08:17:18.717745 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:17:18.717996 kubelet[2569]: E0213 08:17:18.717759 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 08:17:18.792860 kernel: audit: type=1106 audit(1707812238.232:2449): pid=17867 uid=0 auid=500 ses=118 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:18.792899 kernel: audit: type=1104 audit(1707812238.232:2450): pid=17867 uid=0 auid=500 ses=118 
subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:18.232000 audit[17867]: CRED_DISP pid=17867 uid=0 auid=500 ses=118 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:18.233000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@130-145.40.90.207:22-139.178.68.195:52232 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:18.663000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@131-145.40.90.207:22-202.188.109.48:40088 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:19.706581 env[1458]: time="2024-02-13T08:17:19.706479177Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 08:17:19.757376 env[1458]: time="2024-02-13T08:17:19.757282387Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:17:19.757591 kubelet[2569]: E0213 08:17:19.757567 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 08:17:19.757966 kubelet[2569]: E0213 08:17:19.757616 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 08:17:19.757966 kubelet[2569]: E0213 08:17:19.757685 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:17:19.757966 kubelet[2569]: E0213 08:17:19.757725 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 08:17:19.834883 sshd[17891]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=202.188.109.48 user=root Feb 13 08:17:19.834000 audit[17891]: ANOM_LOGIN_FAILURES pid=17891 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='pam_faillock uid=0 exe="/usr/sbin/sshd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:19.834000 audit[17891]: USER_AUTH pid=17891 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? acct="root" exe="/usr/sbin/sshd" hostname=202.188.109.48 addr=202.188.109.48 terminal=ssh res=failed' Feb 13 08:17:19.835111 sshd[17891]: pam_faillock(sshd:auth): Consecutive login failures for user root account temporarily locked Feb 13 08:17:20.706289 env[1458]: time="2024-02-13T08:17:20.706196587Z" level=info msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\"" Feb 13 08:17:20.756978 env[1458]: time="2024-02-13T08:17:20.756882748Z" level=error msg="StopPodSandbox for \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\" failed" error="failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:17:20.757542 kubelet[2569]: E0213 08:17:20.757190 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b" Feb 13 08:17:20.757542 kubelet[2569]: E0213 08:17:20.757252 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b} Feb 13 08:17:20.757542 kubelet[2569]: E0213 08:17:20.757336 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:17:20.757542 kubelet[2569]: E0213 08:17:20.757406 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18082ea3-5d5e-4eed-963b-be8271107e06\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d49648fd683d75f5f04f0d4ef5c083a2398eb1787e505772048e4a7b4002e3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-7xbl5" podUID=18082ea3-5d5e-4eed-963b-be8271107e06 Feb 13 08:17:21.178888 sshd[17891]: Failed password for root from 
202.188.109.48 port 40088 ssh2 Feb 13 08:17:21.767603 sshd[17891]: Received disconnect from 202.188.109.48 port 40088:11: Bye Bye [preauth] Feb 13 08:17:21.767603 sshd[17891]: Disconnected from authenticating user root 202.188.109.48 port 40088 [preauth] Feb 13 08:17:21.770140 systemd[1]: sshd@131-145.40.90.207:22-202.188.109.48:40088.service: Deactivated successfully. Feb 13 08:17:21.770000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@131-145.40.90.207:22-202.188.109.48:40088 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:23.236132 systemd[1]: Started sshd@132-145.40.90.207:22-139.178.68.195:52244.service. Feb 13 08:17:23.235000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@132-145.40.90.207:22-139.178.68.195:52244 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:23.263144 kernel: kauditd_printk_skb: 5 callbacks suppressed Feb 13 08:17:23.263223 kernel: audit: type=1130 audit(1707812243.235:2456): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@132-145.40.90.207:22-139.178.68.195:52244 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:23.380000 audit[18011]: USER_ACCT pid=18011 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:23.381574 sshd[18011]: Accepted publickey for core from 139.178.68.195 port 52244 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:17:23.382740 sshd[18011]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:17:23.384974 systemd-logind[1446]: New session 119 of user core. Feb 13 08:17:23.385409 systemd[1]: Started session-119.scope. Feb 13 08:17:23.463947 sshd[18011]: pam_unix(sshd:session): session closed for user core Feb 13 08:17:23.465317 systemd[1]: sshd@132-145.40.90.207:22-139.178.68.195:52244.service: Deactivated successfully. Feb 13 08:17:23.465747 systemd[1]: session-119.scope: Deactivated successfully. Feb 13 08:17:23.466069 systemd-logind[1446]: Session 119 logged out. Waiting for processes to exit. Feb 13 08:17:23.466513 systemd-logind[1446]: Removed session 119. 
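
The audit PROCTITLE records above carry the sshd process title as a bare hex string (proctitle=737368643A20636F7265205B707269765D). Below is a minimal Python sketch for decoding such a payload, assuming the standard Linux audit encoding in which the raw argv block is hex-dumped with NUL bytes between arguments; the function name is illustrative:

    def decode_proctitle(hex_payload: str) -> str:
        # The audit subsystem hex-encodes the process's raw argv block;
        # NUL bytes separate the individual arguments.
        raw = bytes.fromhex(hex_payload)
        return " ".join(part.decode("utf-8", errors="replace")
                        for part in raw.split(b"\x00") if part)

    print(decode_proctitle("737368643A20636F7265205B707269765D"))  # sshd: core [priv]

Decoded, the value seen throughout this log reads "sshd: core [priv]", the privilege-separated sshd parent for the core sessions.
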
Feb 13 08:17:23.381000 audit[18011]: CRED_ACQ pid=18011 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:23.563886 kernel: audit: type=1101 audit(1707812243.380:2457): pid=18011 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:23.563930 kernel: audit: type=1103 audit(1707812243.381:2458): pid=18011 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:23.563948 kernel: audit: type=1006 audit(1707812243.381:2459): pid=18011 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=119 res=1 Feb 13 08:17:23.622761 kernel: audit: type=1300 audit(1707812243.381:2459): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd2df53f30 a2=3 a3=0 items=0 ppid=1 pid=18011 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=119 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:23.381000 audit[18011]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd2df53f30 a2=3 a3=0 items=0 ppid=1 pid=18011 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=119 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:23.715033 kernel: audit: type=1327 audit(1707812243.381:2459): proctitle=737368643A20636F7265205B707269765D Feb 13 08:17:23.381000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:17:23.745662 kernel: audit: type=1105 audit(1707812243.386:2460): pid=18011 uid=0 auid=500 ses=119 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:23.386000 audit[18011]: USER_START pid=18011 uid=0 auid=500 ses=119 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:23.840357 kernel: audit: type=1103 audit(1707812243.387:2461): pid=18013 uid=0 auid=500 ses=119 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:23.387000 audit[18013]: CRED_ACQ pid=18013 uid=0 auid=500 ses=119 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:23.929817 kernel: audit: type=1106 audit(1707812243.463:2462): pid=18011 uid=0 auid=500 ses=119 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close 
grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:23.463000 audit[18011]: USER_END pid=18011 uid=0 auid=500 ses=119 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:24.025552 kernel: audit: type=1104 audit(1707812243.464:2463): pid=18011 uid=0 auid=500 ses=119 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:23.464000 audit[18011]: CRED_DISP pid=18011 uid=0 auid=500 ses=119 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:23.464000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@132-145.40.90.207:22-139.178.68.195:52244 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:28.473526 systemd[1]: Started sshd@133-145.40.90.207:22-139.178.68.195:60962.service. Feb 13 08:17:28.472000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@133-145.40.90.207:22-139.178.68.195:60962 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:28.500517 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:17:28.500570 kernel: audit: type=1130 audit(1707812248.472:2465): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@133-145.40.90.207:22-139.178.68.195:60962 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:28.618000 audit[18036]: USER_ACCT pid=18036 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:28.619047 sshd[18036]: Accepted publickey for core from 139.178.68.195 port 60962 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:17:28.620929 sshd[18036]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:17:28.623275 systemd-logind[1446]: New session 120 of user core. Feb 13 08:17:28.623770 systemd[1]: Started session-120.scope. Feb 13 08:17:28.701132 sshd[18036]: pam_unix(sshd:session): session closed for user core Feb 13 08:17:28.702577 systemd[1]: sshd@133-145.40.90.207:22-139.178.68.195:60962.service: Deactivated successfully. Feb 13 08:17:28.703027 systemd[1]: session-120.scope: Deactivated successfully. Feb 13 08:17:28.703379 systemd-logind[1446]: Session 120 logged out. Waiting for processes to exit. Feb 13 08:17:28.703730 systemd-logind[1446]: Removed session 120. 
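
Each audit header pairs an epoch timestamp with a per-boot serial number, e.g. audit(1707812243.381:2459) alongside the 08:17:23 wall-clock entries. A small sketch for converting one, assuming the epoch is UTC (which agrees with the surrounding wall-clock times); parse_audit_stamp is an illustrative name, not an existing API:

    from datetime import datetime, timezone

    def parse_audit_stamp(stamp: str):
        # Header format: audit(<epoch seconds>.<millis>:<per-boot serial>)
        inner = stamp[stamp.index("(") + 1 : stamp.rindex(")")]
        ts, serial = inner.split(":")
        return datetime.fromtimestamp(float(ts), tz=timezone.utc), int(serial)

    when, serial = parse_audit_stamp("audit(1707812243.381:2459)")
    print(when.isoformat(), serial)  # 2024-02-13T08:17:23.381000+00:00 2459
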
Feb 13 08:17:28.620000 audit[18036]: CRED_ACQ pid=18036 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:28.801377 kernel: audit: type=1101 audit(1707812248.618:2466): pid=18036 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:28.801417 kernel: audit: type=1103 audit(1707812248.620:2467): pid=18036 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:28.801438 kernel: audit: type=1006 audit(1707812248.620:2468): pid=18036 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=120 res=1 Feb 13 08:17:28.860217 kernel: audit: type=1300 audit(1707812248.620:2468): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe59d03260 a2=3 a3=0 items=0 ppid=1 pid=18036 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=120 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:28.620000 audit[18036]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe59d03260 a2=3 a3=0 items=0 ppid=1 pid=18036 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=120 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:28.952491 kernel: audit: type=1327 audit(1707812248.620:2468): proctitle=737368643A20636F7265205B707269765D Feb 13 08:17:28.620000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:17:28.983018 kernel: audit: type=1105 audit(1707812248.625:2469): pid=18036 uid=0 auid=500 ses=120 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:28.625000 audit[18036]: USER_START pid=18036 uid=0 auid=500 ses=120 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:29.077714 kernel: audit: type=1103 audit(1707812248.626:2470): pid=18038 uid=0 auid=500 ses=120 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:28.626000 audit[18038]: CRED_ACQ pid=18038 uid=0 auid=500 ses=120 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:29.167109 kernel: audit: type=1106 audit(1707812248.701:2471): pid=18036 uid=0 auid=500 ses=120 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close 
grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:28.701000 audit[18036]: USER_END pid=18036 uid=0 auid=500 ses=120 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:28.701000 audit[18036]: CRED_DISP pid=18036 uid=0 auid=500 ses=120 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:29.352268 kernel: audit: type=1104 audit(1707812248.701:2472): pid=18036 uid=0 auid=500 ses=120 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:28.702000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@133-145.40.90.207:22-139.178.68.195:60962 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:30.707027 env[1458]: time="2024-02-13T08:17:30.706888601Z" level=info msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\"" Feb 13 08:17:30.735942 env[1458]: time="2024-02-13T08:17:30.735870887Z" level=error msg="StopPodSandbox for \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\" failed" error="failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:17:30.736118 kubelet[2569]: E0213 08:17:30.736076 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e" Feb 13 08:17:30.736118 kubelet[2569]: E0213 08:17:30.736100 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e} Feb 13 08:17:30.736310 kubelet[2569]: E0213 08:17:30.736122 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:17:30.736310 kubelet[2569]: E0213 08:17:30.736139 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for 
\"d3077153-a5bc-4449-ba4f-3a1b2983528b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1a7bd63da9831f5c4a379b0ef2974dc032cb363d4b1174653e25ff03f3e7567e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8djc9" podUID=d3077153-a5bc-4449-ba4f-3a1b2983528b Feb 13 08:17:31.706667 env[1458]: time="2024-02-13T08:17:31.706501954Z" level=info msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\"" Feb 13 08:17:31.721706 env[1458]: time="2024-02-13T08:17:31.721628213Z" level=error msg="StopPodSandbox for \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\" failed" error="failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:17:31.721928 kubelet[2569]: E0213 08:17:31.721828 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100" Feb 13 08:17:31.721928 kubelet[2569]: E0213 08:17:31.721858 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100} Feb 13 08:17:31.721928 kubelet[2569]: E0213 08:17:31.721880 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:17:31.721928 kubelet[2569]: E0213 08:17:31.721903 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ec457c2-fc28-4626-897d-f9a56a1fa755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eb78bff43b62ee436121cef30811f8c11f2537cf3beba55a3bf4e172604c100\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5d78c9869d-qrnjl" podUID=9ec457c2-fc28-4626-897d-f9a56a1fa755 Feb 13 08:17:33.706025 env[1458]: time="2024-02-13T08:17:33.705938971Z" level=info msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\"" Feb 13 08:17:33.708775 systemd[1]: Started sshd@134-145.40.90.207:22-139.178.68.195:60970.service. 
Feb 13 08:17:33.708000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@134-145.40.90.207:22-139.178.68.195:60970 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:33.725798 env[1458]: time="2024-02-13T08:17:33.725670624Z" level=error msg="StopPodSandbox for \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\" failed" error="failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:17:33.726026 kubelet[2569]: E0213 08:17:33.725963 2569 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988" Feb 13 08:17:33.726189 kubelet[2569]: E0213 08:17:33.726034 2569 kuberuntime_manager.go:1312] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988} Feb 13 08:17:33.726189 kubelet[2569]: E0213 08:17:33.726062 2569 kuberuntime_manager.go:1038] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:17:33.726189 kubelet[2569]: E0213 08:17:33.726085 2569 pod_workers.go:1294] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59431b79-ecac-4529-9083-2bad55873c23\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce02c9de795c78bfe0b1a29679c4393742771d8759d9f9d7aee8956c9a084988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-846b88998b-4vbpv" podUID=59431b79-ecac-4529-9083-2bad55873c23 Feb 13 08:17:33.736281 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:17:33.736377 kernel: audit: type=1130 audit(1707812253.708:2474): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@134-145.40.90.207:22-139.178.68.195:60970 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:17:33.851000 audit[18121]: USER_ACCT pid=18121 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:33.852620 sshd[18121]: Accepted publickey for core from 139.178.68.195 port 60970 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:17:33.853947 sshd[18121]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:17:33.856327 systemd-logind[1446]: New session 121 of user core. Feb 13 08:17:33.856931 systemd[1]: Started session-121.scope. Feb 13 08:17:33.936220 sshd[18121]: pam_unix(sshd:session): session closed for user core Feb 13 08:17:33.937701 systemd[1]: sshd@134-145.40.90.207:22-139.178.68.195:60970.service: Deactivated successfully. Feb 13 08:17:33.938144 systemd[1]: session-121.scope: Deactivated successfully. Feb 13 08:17:33.938433 systemd-logind[1446]: Session 121 logged out. Waiting for processes to exit. Feb 13 08:17:33.938785 systemd-logind[1446]: Removed session 121. Feb 13 08:17:33.853000 audit[18121]: CRED_ACQ pid=18121 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:34.034711 kernel: audit: type=1101 audit(1707812253.851:2475): pid=18121 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:34.034753 kernel: audit: type=1103 audit(1707812253.853:2476): pid=18121 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:34.034772 kernel: audit: type=1006 audit(1707812253.853:2477): pid=18121 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=121 res=1 Feb 13 08:17:34.093569 kernel: audit: type=1300 audit(1707812253.853:2477): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffcb24be950 a2=3 a3=0 items=0 ppid=1 pid=18121 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=121 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:33.853000 audit[18121]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffcb24be950 a2=3 a3=0 items=0 ppid=1 pid=18121 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=121 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:34.185819 kernel: audit: type=1327 audit(1707812253.853:2477): proctitle=737368643A20636F7265205B707269765D Feb 13 08:17:33.853000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:17:34.216313 kernel: audit: type=1105 audit(1707812253.858:2478): pid=18121 uid=0 auid=500 ses=121 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 
08:17:33.858000 audit[18121]: USER_START pid=18121 uid=0 auid=500 ses=121 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:34.311050 kernel: audit: type=1103 audit(1707812253.858:2479): pid=18149 uid=0 auid=500 ses=121 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:33.858000 audit[18149]: CRED_ACQ pid=18149 uid=0 auid=500 ses=121 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:34.400432 kernel: audit: type=1106 audit(1707812253.936:2480): pid=18121 uid=0 auid=500 ses=121 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:33.936000 audit[18121]: USER_END pid=18121 uid=0 auid=500 ses=121 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:34.496078 kernel: audit: type=1104 audit(1707812253.936:2481): pid=18121 uid=0 auid=500 ses=121 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:33.936000 audit[18121]: CRED_DISP pid=18121 uid=0 auid=500 ses=121 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:33.937000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@134-145.40.90.207:22-139.178.68.195:60970 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
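
The excerpt closes the way it has run throughout: accepted publickey sessions for core from 139.178.68.195 interleaved with failed root logins from 202.188.109.48. A rough sketch for tallying those outcomes from journal text like the above; the regexes and the journal.txt filename are assumptions for illustration, not part of the log:

    import re
    from collections import Counter

    # Patterns match the sshd lines seen above; both are illustrative, not exhaustive.
    ACCEPTED = re.compile(r"Accepted publickey for (\S+) from (\S+)")
    FAILED = re.compile(r"Failed password for (\S+) from\s+(\S+)")

    def summarize(lines):
        tally = Counter()
        for line in lines:
            if (m := ACCEPTED.search(line)):
                tally[("accepted", m.group(1), m.group(2))] += 1
            elif (m := FAILED.search(line)):
                tally[("failed", m.group(1), m.group(2))] += 1
        return tally

    # e.g. summarize(open("journal.txt")) would separate the core logins from
    # 139.178.68.195 from the failed root attempts from 202.188.109.48.
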