Feb 13 09:51:59.549048 kernel: Linux version 5.15.148-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 11.3.1_p20221209 p3) 11.3.1 20221209, GNU ld (Gentoo 2.39 p5) 2.39.0) #1 SMP Mon Feb 12 18:05:31 -00 2024
Feb 13 09:51:59.549060 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty0 console=ttyS1,115200n8 flatcar.first_boot=detected flatcar.oem.id=packet flatcar.autologin verity.usrhash=f2beb0668e3dab90bbcf0ace3803b7ee02142bfb86913ef12ef6d2ee81a411a4
Feb 13 09:51:59.549067 kernel: BIOS-provided physical RAM map:
Feb 13 09:51:59.549071 kernel: BIOS-e820: [mem 0x0000000000000000-0x00000000000997ff] usable
Feb 13 09:51:59.549075 kernel: BIOS-e820: [mem 0x0000000000099800-0x000000000009ffff] reserved
Feb 13 09:51:59.549079 kernel: BIOS-e820: [mem 0x00000000000e0000-0x00000000000fffff] reserved
Feb 13 09:51:59.549083 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000003fffffff] usable
Feb 13 09:51:59.549087 kernel: BIOS-e820: [mem 0x0000000040000000-0x00000000403fffff] reserved
Feb 13 09:51:59.549091 kernel: BIOS-e820: [mem 0x0000000040400000-0x0000000081f88fff] usable
Feb 13 09:51:59.549094 kernel: BIOS-e820: [mem 0x0000000081f89000-0x0000000081f89fff] ACPI NVS
Feb 13 09:51:59.549099 kernel: BIOS-e820: [mem 0x0000000081f8a000-0x0000000081f8afff] reserved
Feb 13 09:51:59.549103 kernel: BIOS-e820: [mem 0x0000000081f8b000-0x000000008afccfff] usable
Feb 13 09:51:59.549107 kernel: BIOS-e820: [mem 0x000000008afcd000-0x000000008c0b1fff] reserved
Feb 13 09:51:59.549110 kernel: BIOS-e820: [mem 0x000000008c0b2000-0x000000008c23afff] usable
Feb 13 09:51:59.549115 kernel: BIOS-e820: [mem 0x000000008c23b000-0x000000008c66cfff] ACPI NVS
Feb 13 09:51:59.549120 kernel: BIOS-e820: [mem 0x000000008c66d000-0x000000008eefefff] reserved
Feb 13 09:51:59.549125 kernel: BIOS-e820: [mem 0x000000008eeff000-0x000000008eefffff] usable
Feb 13 09:51:59.549129 kernel: BIOS-e820: [mem 0x000000008ef00000-0x000000008fffffff] reserved
Feb 13 09:51:59.549133 kernel: BIOS-e820: [mem 0x00000000e0000000-0x00000000efffffff] reserved
Feb 13 09:51:59.549137 kernel: BIOS-e820: [mem 0x00000000fe000000-0x00000000fe010fff] reserved
Feb 13 09:51:59.549141 kernel: BIOS-e820: [mem 0x00000000fec00000-0x00000000fec00fff] reserved
Feb 13 09:51:59.549145 kernel: BIOS-e820: [mem 0x00000000fee00000-0x00000000fee00fff] reserved
Feb 13 09:51:59.549149 kernel: BIOS-e820: [mem 0x00000000ff000000-0x00000000ffffffff] reserved
Feb 13 09:51:59.549153 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000086effffff] usable
Feb 13 09:51:59.549157 kernel: NX (Execute Disable) protection: active
Feb 13 09:51:59.549161 kernel: SMBIOS 3.2.1 present.
Feb 13 09:51:59.549167 kernel: DMI: Supermicro Super Server/X11SCM-F, BIOS 1.9 09/16/2022
Feb 13 09:51:59.549171 kernel: tsc: Detected 3400.000 MHz processor
Feb 13 09:51:59.549175 kernel: tsc: Detected 3399.906 MHz TSC
Feb 13 09:51:59.549179 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Feb 13 09:51:59.549184 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Feb 13 09:51:59.549188 kernel: last_pfn = 0x86f000 max_arch_pfn = 0x400000000
Feb 13 09:51:59.549192 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Feb 13 09:51:59.549197 kernel: last_pfn = 0x8ef00 max_arch_pfn = 0x400000000
Feb 13 09:51:59.549201 kernel: Using GB pages for direct mapping
Feb 13 09:51:59.549205 kernel: ACPI: Early table checksum verification disabled
Feb 13 09:51:59.549210 kernel: ACPI: RSDP 0x00000000000F05B0 000024 (v02 SUPERM)
Feb 13 09:51:59.549214 kernel: ACPI: XSDT 0x000000008C54E0C8 00010C (v01 SUPERM SUPERM 01072009 AMI 00010013)
Feb 13 09:51:59.549219 kernel: ACPI: FACP 0x000000008C58A670 000114 (v06 01072009 AMI 00010013)
Feb 13 09:51:59.549223 kernel: ACPI: DSDT 0x000000008C54E268 03C404 (v02 SUPERM SMCI--MB 01072009 INTL 20160527)
Feb 13 09:51:59.549229 kernel: ACPI: FACS 0x000000008C66CF80 000040
Feb 13 09:51:59.549234 kernel: ACPI: APIC 0x000000008C58A788 00012C (v04 01072009 AMI 00010013)
Feb 13 09:51:59.549239 kernel: ACPI: FPDT 0x000000008C58A8B8 000044 (v01 01072009 AMI 00010013)
Feb 13 09:51:59.549244 kernel: ACPI: FIDT 0x000000008C58A900 00009C (v01 SUPERM SMCI--MB 01072009 AMI 00010013)
Feb 13 09:51:59.549249 kernel: ACPI: MCFG 0x000000008C58A9A0 00003C (v01 SUPERM SMCI--MB 01072009 MSFT 00000097)
Feb 13 09:51:59.549253 kernel: ACPI: SPMI 0x000000008C58A9E0 000041 (v05 SUPERM SMCI--MB 00000000 AMI. 00000000)
Feb 13 09:51:59.549258 kernel: ACPI: SSDT 0x000000008C58AA28 001B1C (v02 CpuRef CpuSsdt 00003000 INTL 20160527)
Feb 13 09:51:59.549263 kernel: ACPI: SSDT 0x000000008C58C548 0031C6 (v02 SaSsdt SaSsdt 00003000 INTL 20160527)
Feb 13 09:51:59.549267 kernel: ACPI: SSDT 0x000000008C58F710 00232B (v02 PegSsd PegSsdt 00001000 INTL 20160527)
Feb 13 09:51:59.549272 kernel: ACPI: HPET 0x000000008C591A40 000038 (v01 SUPERM SMCI--MB 00000002 01000013)
Feb 13 09:51:59.549278 kernel: ACPI: SSDT 0x000000008C591A78 000FAE (v02 SUPERM Ther_Rvp 00001000 INTL 20160527)
Feb 13 09:51:59.549282 kernel: ACPI: SSDT 0x000000008C592A28 0008F4 (v02 INTEL xh_mossb 00000000 INTL 20160527)
Feb 13 09:51:59.549287 kernel: ACPI: UEFI 0x000000008C593320 000042 (v01 SUPERM SMCI--MB 00000002 01000013)
Feb 13 09:51:59.549291 kernel: ACPI: LPIT 0x000000008C593368 000094 (v01 SUPERM SMCI--MB 00000002 01000013)
Feb 13 09:51:59.549296 kernel: ACPI: SSDT 0x000000008C593400 0027DE (v02 SUPERM PtidDevc 00001000 INTL 20160527)
Feb 13 09:51:59.549301 kernel: ACPI: SSDT 0x000000008C595BE0 0014E2 (v02 SUPERM TbtTypeC 00000000 INTL 20160527)
Feb 13 09:51:59.549305 kernel: ACPI: DBGP 0x000000008C5970C8 000034 (v01 SUPERM SMCI--MB 00000002 01000013)
Feb 13 09:51:59.549310 kernel: ACPI: DBG2 0x000000008C597100 000054 (v00 SUPERM SMCI--MB 00000002 01000013)
Feb 13 09:51:59.549315 kernel: ACPI: SSDT 0x000000008C597158 001B67 (v02 SUPERM UsbCTabl 00001000 INTL 20160527)
Feb 13 09:51:59.549320 kernel: ACPI: DMAR 0x000000008C598CC0 000070 (v01 INTEL EDK2 00000002 01000013)
Feb 13 09:51:59.549324 kernel: ACPI: SSDT 0x000000008C598D30 000144 (v02 Intel ADebTabl 00001000 INTL 20160527)
Feb 13 09:51:59.549329 kernel: ACPI: TPM2 0x000000008C598E78 000034 (v04 SUPERM SMCI--MB 00000001 AMI 00000000)
Feb 13 09:51:59.549334 kernel: ACPI: SSDT 0x000000008C598EB0 000D8F (v02 INTEL SpsNm 00000002 INTL 20160527)
Feb 13 09:51:59.549338 kernel: ACPI: WSMT 0x000000008C599C40 000028 (v01 SUPERM 01072009 AMI 00010013)
Feb 13 09:51:59.549343 kernel: ACPI: EINJ 0x000000008C599C68 000130 (v01 AMI AMI.EINJ 00000000 AMI. 00000000)
Feb 13 09:51:59.549347 kernel: ACPI: ERST 0x000000008C599D98 000230 (v01 AMIER AMI.ERST 00000000 AMI. 00000000)
Feb 13 09:51:59.549352 kernel: ACPI: BERT 0x000000008C599FC8 000030 (v01 AMI AMI.BERT 00000000 AMI. 00000000)
Feb 13 09:51:59.549358 kernel: ACPI: HEST 0x000000008C599FF8 00027C (v01 AMI AMI.HEST 00000000 AMI. 00000000)
Feb 13 09:51:59.549362 kernel: ACPI: SSDT 0x000000008C59A278 000162 (v01 SUPERM SMCCDN 00000000 INTL 20181221)
Feb 13 09:51:59.549367 kernel: ACPI: Reserving FACP table memory at [mem 0x8c58a670-0x8c58a783]
Feb 13 09:51:59.549372 kernel: ACPI: Reserving DSDT table memory at [mem 0x8c54e268-0x8c58a66b]
Feb 13 09:51:59.549376 kernel: ACPI: Reserving FACS table memory at [mem 0x8c66cf80-0x8c66cfbf]
Feb 13 09:51:59.549381 kernel: ACPI: Reserving APIC table memory at [mem 0x8c58a788-0x8c58a8b3]
Feb 13 09:51:59.549385 kernel: ACPI: Reserving FPDT table memory at [mem 0x8c58a8b8-0x8c58a8fb]
Feb 13 09:51:59.549390 kernel: ACPI: Reserving FIDT table memory at [mem 0x8c58a900-0x8c58a99b]
Feb 13 09:51:59.549395 kernel: ACPI: Reserving MCFG table memory at [mem 0x8c58a9a0-0x8c58a9db]
Feb 13 09:51:59.549400 kernel: ACPI: Reserving SPMI table memory at [mem 0x8c58a9e0-0x8c58aa20]
Feb 13 09:51:59.549405 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c58aa28-0x8c58c543]
Feb 13 09:51:59.549409 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c58c548-0x8c58f70d]
Feb 13 09:51:59.549414 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c58f710-0x8c591a3a]
Feb 13 09:51:59.549418 kernel: ACPI: Reserving HPET table memory at [mem 0x8c591a40-0x8c591a77]
Feb 13 09:51:59.549423 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c591a78-0x8c592a25]
Feb 13 09:51:59.549427 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c592a28-0x8c59331b]
Feb 13 09:51:59.549432 kernel: ACPI: Reserving UEFI table memory at [mem 0x8c593320-0x8c593361]
Feb 13 09:51:59.549437 kernel: ACPI: Reserving LPIT table memory at [mem 0x8c593368-0x8c5933fb]
Feb 13 09:51:59.549442 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c593400-0x8c595bdd]
Feb 13 09:51:59.549447 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c595be0-0x8c5970c1]
Feb 13 09:51:59.549454 kernel: ACPI: Reserving DBGP table memory at [mem 0x8c5970c8-0x8c5970fb]
Feb 13 09:51:59.549458 kernel: ACPI: Reserving DBG2 table memory at [mem 0x8c597100-0x8c597153]
Feb 13 09:51:59.549482 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c597158-0x8c598cbe]
Feb 13 09:51:59.549486 kernel: ACPI: Reserving DMAR table memory at [mem 0x8c598cc0-0x8c598d2f]
Feb 13 09:51:59.549491 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c598d30-0x8c598e73]
Feb 13 09:51:59.549496 kernel: ACPI: Reserving TPM2 table memory at [mem 0x8c598e78-0x8c598eab]
Feb 13 09:51:59.549501 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c598eb0-0x8c599c3e]
Feb 13 09:51:59.549506 kernel: ACPI: Reserving WSMT table memory at [mem 0x8c599c40-0x8c599c67]
Feb 13 09:51:59.549525 kernel: ACPI: Reserving EINJ table memory at [mem 0x8c599c68-0x8c599d97]
Feb 13 09:51:59.549530 kernel: ACPI: Reserving ERST table memory at [mem 0x8c599d98-0x8c599fc7]
Feb 13 09:51:59.549534 kernel: ACPI: Reserving BERT table memory at [mem 0x8c599fc8-0x8c599ff7]
Feb 13 09:51:59.549539 kernel: ACPI: Reserving HEST table memory at [mem 0x8c599ff8-0x8c59a273]
Feb 13 09:51:59.549544 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c59a278-0x8c59a3d9]
Feb 13 09:51:59.549548 kernel: No NUMA configuration found
Feb 13 09:51:59.549553 kernel: Faking a node at [mem 0x0000000000000000-0x000000086effffff]
Feb 13 09:51:59.549558 kernel: NODE_DATA(0) allocated [mem 0x86effa000-0x86effffff]
Feb 13 09:51:59.549563 kernel: Zone ranges:
Feb 13 09:51:59.549568 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Feb 13 09:51:59.549572 kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff]
Feb 13 09:51:59.549577 kernel: Normal [mem 0x0000000100000000-0x000000086effffff]
Feb 13 09:51:59.549581 kernel: Movable zone start for each node
Feb 13 09:51:59.549586 kernel: Early memory node ranges
Feb 13 09:51:59.549591 kernel: node 0: [mem 0x0000000000001000-0x0000000000098fff]
Feb 13 09:51:59.549595 kernel: node 0: [mem 0x0000000000100000-0x000000003fffffff]
Feb 13 09:51:59.549600 kernel: node 0: [mem 0x0000000040400000-0x0000000081f88fff]
Feb 13 09:51:59.549605 kernel: node 0: [mem 0x0000000081f8b000-0x000000008afccfff]
Feb 13 09:51:59.549610 kernel: node 0: [mem 0x000000008c0b2000-0x000000008c23afff]
Feb 13 09:51:59.549614 kernel: node 0: [mem 0x000000008eeff000-0x000000008eefffff]
Feb 13 09:51:59.549619 kernel: node 0: [mem 0x0000000100000000-0x000000086effffff]
Feb 13 09:51:59.549624 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000086effffff]
Feb 13 09:51:59.549628 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Feb 13 09:51:59.549636 kernel: On node 0, zone DMA: 103 pages in unavailable ranges
Feb 13 09:51:59.549642 kernel: On node 0, zone DMA32: 1024 pages in unavailable ranges
Feb 13 09:51:59.549647 kernel: On node 0, zone DMA32: 2 pages in unavailable ranges
Feb 13 09:51:59.549652 kernel: On node 0, zone DMA32: 4325 pages in unavailable ranges
Feb 13 09:51:59.549657 kernel: On node 0, zone DMA32: 11460 pages in unavailable ranges
Feb 13 09:51:59.549662 kernel: On node 0, zone Normal: 4352 pages in unavailable ranges
Feb 13 09:51:59.549667 kernel: On node 0, zone Normal: 4096 pages in unavailable ranges
Feb 13 09:51:59.549672 kernel: ACPI: PM-Timer IO Port: 0x1808
Feb 13 09:51:59.549677 kernel: ACPI: LAPIC_NMI (acpi_id[0x01] high edge lint[0x1])
Feb 13 09:51:59.549682 kernel: ACPI: LAPIC_NMI (acpi_id[0x02] high edge lint[0x1])
Feb 13 09:51:59.549687 kernel: ACPI: LAPIC_NMI (acpi_id[0x03] high edge lint[0x1])
Feb 13 09:51:59.549693 kernel: ACPI: LAPIC_NMI (acpi_id[0x04] high edge lint[0x1])
Feb 13 09:51:59.549697 kernel: ACPI: LAPIC_NMI (acpi_id[0x05] high edge lint[0x1])
Feb 13 09:51:59.549702 kernel: ACPI: LAPIC_NMI (acpi_id[0x06] high edge lint[0x1])
Feb 13 09:51:59.549707 kernel: ACPI: LAPIC_NMI (acpi_id[0x07] high edge lint[0x1])
Feb 13 09:51:59.549712 kernel: ACPI: LAPIC_NMI (acpi_id[0x08] high edge lint[0x1])
Feb 13 09:51:59.549717 kernel: ACPI: LAPIC_NMI (acpi_id[0x09] high edge lint[0x1])
Feb 13 09:51:59.549722 kernel: ACPI: LAPIC_NMI (acpi_id[0x0a] high edge lint[0x1])
Feb 13 09:51:59.549727 kernel: ACPI: LAPIC_NMI (acpi_id[0x0b] high edge lint[0x1])
Feb 13 09:51:59.549731 kernel: ACPI: LAPIC_NMI (acpi_id[0x0c] high edge lint[0x1])
Feb 13 09:51:59.549737 kernel: ACPI: LAPIC_NMI (acpi_id[0x0d] high edge lint[0x1])
Feb 13 09:51:59.549742 kernel: ACPI: LAPIC_NMI (acpi_id[0x0e] high edge lint[0x1])
Feb 13 09:51:59.549747 kernel: ACPI: LAPIC_NMI (acpi_id[0x0f] high edge lint[0x1])
Feb 13 09:51:59.549752 kernel: ACPI: LAPIC_NMI (acpi_id[0x10] high edge lint[0x1])
Feb 13 09:51:59.549756 kernel: IOAPIC[0]: apic_id 2, version 32, address 0xfec00000, GSI 0-119
Feb 13 09:51:59.549761 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Feb 13 09:51:59.549766 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Feb 13 09:51:59.549771 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Feb 13 09:51:59.549776 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000
Feb 13 09:51:59.549782 kernel: TSC deadline timer available
Feb 13 09:51:59.549787 kernel: smpboot: Allowing 16 CPUs, 0 hotplug CPUs
Feb 13 09:51:59.549792 kernel: [mem 0x90000000-0xdfffffff] available for PCI devices
Feb 13 09:51:59.549797 kernel: Booting paravirtualized kernel on bare hardware
Feb 13 09:51:59.549801 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Feb 13 09:51:59.549806 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:512 nr_cpu_ids:16 nr_node_ids:1
Feb 13 09:51:59.549811 kernel: percpu: Embedded 55 pages/cpu s185624 r8192 d31464 u262144
Feb 13 09:51:59.549816 kernel: pcpu-alloc: s185624 r8192 d31464 u262144 alloc=1*2097152
Feb 13 09:51:59.549821 kernel: pcpu-alloc: [0] 00 01 02 03 04 05 06 07 [0] 08 09 10 11 12 13 14 15
Feb 13 09:51:59.549827 kernel: Built 1 zonelists, mobility grouping on. Total pages: 8232415
Feb 13 09:51:59.549832 kernel: Policy zone: Normal
Feb 13 09:51:59.549837 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty0 console=ttyS1,115200n8 flatcar.first_boot=detected flatcar.oem.id=packet flatcar.autologin verity.usrhash=f2beb0668e3dab90bbcf0ace3803b7ee02142bfb86913ef12ef6d2ee81a411a4
Feb 13 09:51:59.549842 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Feb 13 09:51:59.549847 kernel: Dentry cache hash table entries: 4194304 (order: 13, 33554432 bytes, linear)
Feb 13 09:51:59.549852 kernel: Inode-cache hash table entries: 2097152 (order: 12, 16777216 bytes, linear)
Feb 13 09:51:59.549857 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Feb 13 09:51:59.549863 kernel: Memory: 32724720K/33452980K available (12294K kernel code, 2275K rwdata, 13700K rodata, 45496K init, 4048K bss, 728000K reserved, 0K cma-reserved)
Feb 13 09:51:59.549868 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=16, Nodes=1
Feb 13 09:51:59.549873 kernel: ftrace: allocating 34475 entries in 135 pages
Feb 13 09:51:59.549878 kernel: ftrace: allocated 135 pages with 4 groups
Feb 13 09:51:59.549883 kernel: rcu: Hierarchical RCU implementation.
Feb 13 09:51:59.549888 kernel: rcu: RCU event tracing is enabled.
Feb 13 09:51:59.549894 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=16.
Feb 13 09:51:59.549899 kernel: Rude variant of Tasks RCU enabled.
Feb 13 09:51:59.549903 kernel: Tracing variant of Tasks RCU enabled.
Feb 13 09:51:59.549909 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Feb 13 09:51:59.549914 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=16
Feb 13 09:51:59.549919 kernel: NR_IRQS: 33024, nr_irqs: 2184, preallocated irqs: 16
Feb 13 09:51:59.549924 kernel: random: crng init done
Feb 13 09:51:59.549929 kernel: Console: colour dummy device 80x25
Feb 13 09:51:59.549934 kernel: printk: console [tty0] enabled
Feb 13 09:51:59.549939 kernel: printk: console [ttyS1] enabled
Feb 13 09:51:59.549944 kernel: ACPI: Core revision 20210730
Feb 13 09:51:59.549949 kernel: hpet: HPET dysfunctional in PC10. Force disabled.
Feb 13 09:51:59.549954 kernel: APIC: Switch to symmetric I/O mode setup
Feb 13 09:51:59.549960 kernel: DMAR: Host address width 39
Feb 13 09:51:59.549965 kernel: DMAR: DRHD base: 0x000000fed91000 flags: 0x1
Feb 13 09:51:59.549970 kernel: DMAR: dmar0: reg_base_addr fed91000 ver 1:0 cap d2008c40660462 ecap f050da
Feb 13 09:51:59.549975 kernel: DMAR: RMRR base: 0x0000008cf18000 end: 0x0000008d161fff
Feb 13 09:51:59.549980 kernel: DMAR-IR: IOAPIC id 2 under DRHD base 0xfed91000 IOMMU 0
Feb 13 09:51:59.549984 kernel: DMAR-IR: HPET id 0 under DRHD base 0xfed91000
Feb 13 09:51:59.549989 kernel: DMAR-IR: Queued invalidation will be enabled to support x2apic and Intr-remapping.
Feb 13 09:51:59.549994 kernel: DMAR-IR: Enabled IRQ remapping in x2apic mode
Feb 13 09:51:59.549999 kernel: x2apic enabled
Feb 13 09:51:59.550005 kernel: Switched APIC routing to cluster x2apic.
Feb 13 09:51:59.550010 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x3101f59f5e6, max_idle_ns: 440795259996 ns
Feb 13 09:51:59.550015 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 6799.81 BogoMIPS (lpj=3399906)
Feb 13 09:51:59.550020 kernel: CPU0: Thermal monitoring enabled (TM1)
Feb 13 09:51:59.550025 kernel: process: using mwait in idle threads
Feb 13 09:51:59.550029 kernel: Last level iTLB entries: 4KB 64, 2MB 8, 4MB 8
Feb 13 09:51:59.550034 kernel: Last level dTLB entries: 4KB 64, 2MB 0, 4MB 0, 1GB 4
Feb 13 09:51:59.550039 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Feb 13 09:51:59.550044 kernel: Spectre V2 : WARNING: Unprivileged eBPF is enabled with eIBRS on, data leaks possible via Spectre v2 BHB attacks!
Feb 13 09:51:59.550050 kernel: Spectre V2 : Mitigation: Enhanced IBRS
Feb 13 09:51:59.550055 kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch
Feb 13 09:51:59.550060 kernel: Spectre V2 : Spectre v2 / PBRSB-eIBRS: Retire a single CALL on VMEXIT
Feb 13 09:51:59.550064 kernel: RETBleed: Mitigation: Enhanced IBRS
Feb 13 09:51:59.550069 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Feb 13 09:51:59.550074 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl and seccomp
Feb 13 09:51:59.550079 kernel: TAA: Mitigation: TSX disabled
Feb 13 09:51:59.550084 kernel: MMIO Stale Data: Mitigation: Clear CPU buffers
Feb 13 09:51:59.550089 kernel: SRBDS: Mitigation: Microcode
Feb 13 09:51:59.550093 kernel: GDS: Vulnerable: No microcode
Feb 13 09:51:59.550098 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Feb 13 09:51:59.550104 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Feb 13 09:51:59.550109 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Feb 13 09:51:59.550114 kernel: x86/fpu: Supporting XSAVE feature 0x008: 'MPX bounds registers'
Feb 13 09:51:59.550119 kernel: x86/fpu: Supporting XSAVE feature 0x010: 'MPX CSR'
Feb 13 09:51:59.550123 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Feb 13 09:51:59.550128 kernel: x86/fpu: xstate_offset[3]: 832, xstate_sizes[3]: 64
Feb 13 09:51:59.550133 kernel: x86/fpu: xstate_offset[4]: 896, xstate_sizes[4]: 64
Feb 13 09:51:59.550138 kernel: x86/fpu: Enabled xstate features 0x1f, context size is 960 bytes, using 'compacted' format.
Feb 13 09:51:59.550143 kernel: Freeing SMP alternatives memory: 32K
Feb 13 09:51:59.550148 kernel: pid_max: default: 32768 minimum: 301
Feb 13 09:51:59.550152 kernel: LSM: Security Framework initializing
Feb 13 09:51:59.550158 kernel: SELinux: Initializing.
Feb 13 09:51:59.550163 kernel: Mount-cache hash table entries: 65536 (order: 7, 524288 bytes, linear)
Feb 13 09:51:59.550168 kernel: Mountpoint-cache hash table entries: 65536 (order: 7, 524288 bytes, linear)
Feb 13 09:51:59.550173 kernel: smpboot: Estimated ratio of average max frequency by base frequency (times 1024): 1445
Feb 13 09:51:59.550178 kernel: smpboot: CPU0: Intel(R) Xeon(R) E-2278G CPU @ 3.40GHz (family: 0x6, model: 0x9e, stepping: 0xd)
Feb 13 09:51:59.550182 kernel: Performance Events: PEBS fmt3+, Skylake events, 32-deep LBR, full-width counters, Intel PMU driver.
Feb 13 09:51:59.550187 kernel: ... version: 4
Feb 13 09:51:59.550192 kernel: ... bit width: 48
Feb 13 09:51:59.550197 kernel: ... generic registers: 4
Feb 13 09:51:59.550202 kernel: ... value mask: 0000ffffffffffff
Feb 13 09:51:59.550208 kernel: ... max period: 00007fffffffffff
Feb 13 09:51:59.550213 kernel: ... fixed-purpose events: 3
Feb 13 09:51:59.550218 kernel: ... event mask: 000000070000000f
Feb 13 09:51:59.550222 kernel: signal: max sigframe size: 2032
Feb 13 09:51:59.550227 kernel: rcu: Hierarchical SRCU implementation.
Feb 13 09:51:59.550232 kernel: NMI watchdog: Enabled. Permanently consumes one hw-PMU counter.
Feb 13 09:51:59.550237 kernel: smp: Bringing up secondary CPUs ...
Feb 13 09:51:59.550242 kernel: x86: Booting SMP configuration:
Feb 13 09:51:59.550247 kernel: .... node #0, CPUs: #1 #2 #3 #4 #5 #6 #7 #8
Feb 13 09:51:59.550252 kernel: MMIO Stale Data CPU bug present and SMT on, data leak possible. See https://www.kernel.org/doc/html/latest/admin-guide/hw-vuln/processor_mmio_stale_data.html for more details.
Feb 13 09:51:59.550258 kernel: #9 #10 #11 #12 #13 #14 #15
Feb 13 09:51:59.550263 kernel: smp: Brought up 1 node, 16 CPUs
Feb 13 09:51:59.550268 kernel: smpboot: Max logical packages: 1
Feb 13 09:51:59.550273 kernel: smpboot: Total of 16 processors activated (108796.99 BogoMIPS)
Feb 13 09:51:59.550278 kernel: devtmpfs: initialized
Feb 13 09:51:59.550283 kernel: x86/mm: Memory block size: 128MB
Feb 13 09:51:59.550287 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x81f89000-0x81f89fff] (4096 bytes)
Feb 13 09:51:59.550292 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x8c23b000-0x8c66cfff] (4399104 bytes)
Feb 13 09:51:59.550298 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Feb 13 09:51:59.550303 kernel: futex hash table entries: 4096 (order: 6, 262144 bytes, linear)
Feb 13 09:51:59.550308 kernel: pinctrl core: initialized pinctrl subsystem
Feb 13 09:51:59.550313 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Feb 13 09:51:59.550318 kernel: audit: initializing netlink subsys (disabled)
Feb 13 09:51:59.550323 kernel: audit: type=2000 audit(1707817914.040:1): state=initialized audit_enabled=0 res=1
Feb 13 09:51:59.550328 kernel: thermal_sys: Registered thermal governor 'step_wise'
Feb 13 09:51:59.550333 kernel: thermal_sys: Registered thermal governor 'user_space'
Feb 13 09:51:59.550338 kernel: cpuidle: using governor menu
Feb 13 09:51:59.550344 kernel: ACPI: bus type PCI registered
Feb 13 09:51:59.550349 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Feb 13 09:51:59.550353 kernel: dca service started, version 1.12.1
Feb 13 09:51:59.550358 kernel: PCI: MMCONFIG for domain 0000 [bus 00-ff] at [mem 0xe0000000-0xefffffff] (base 0xe0000000)
Feb 13 09:51:59.550363 kernel: PCI: MMCONFIG at [mem 0xe0000000-0xefffffff] reserved in E820
Feb 13 09:51:59.550368 kernel: PCI: Using configuration type 1 for base access
Feb 13 09:51:59.550373 kernel: ENERGY_PERF_BIAS: Set to 'normal', was 'performance'
Feb 13 09:51:59.550378 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Feb 13 09:51:59.550383 kernel: HugeTLB registered 1.00 GiB page size, pre-allocated 0 pages
Feb 13 09:51:59.550389 kernel: HugeTLB registered 2.00 MiB page size, pre-allocated 0 pages
Feb 13 09:51:59.550393 kernel: ACPI: Added _OSI(Module Device)
Feb 13 09:51:59.550398 kernel: ACPI: Added _OSI(Processor Device)
Feb 13 09:51:59.550403 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Feb 13 09:51:59.550408 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Feb 13 09:51:59.550413 kernel: ACPI: Added _OSI(Linux-Dell-Video)
Feb 13 09:51:59.550418 kernel: ACPI: Added _OSI(Linux-Lenovo-NV-HDMI-Audio)
Feb 13 09:51:59.550423 kernel: ACPI: Added _OSI(Linux-HPI-Hybrid-Graphics)
Feb 13 09:51:59.550428 kernel: ACPI: 12 ACPI AML tables successfully acquired and loaded
Feb 13 09:51:59.550434 kernel: ACPI: Dynamic OEM Table Load:
Feb 13 09:51:59.550439 kernel: ACPI: SSDT 0xFFFF97DC40213500 0000F4 (v02 PmRef Cpu0Psd 00003000 INTL 20160527)
Feb 13 09:51:59.550444 kernel: ACPI: \_SB_.PR00: _OSC native thermal LVT Acked
Feb 13 09:51:59.550449 kernel: ACPI: Dynamic OEM Table Load:
Feb 13 09:51:59.550455 kernel: ACPI: SSDT 0xFFFF97DC41AE4C00 000400 (v02 PmRef Cpu0Cst 00003001 INTL 20160527)
Feb 13 09:51:59.550478 kernel: ACPI: Dynamic OEM Table Load:
Feb 13 09:51:59.550483 kernel: ACPI: SSDT 0xFFFF97DC41A50800 000683 (v02 PmRef Cpu0Ist 00003000 INTL 20160527)
Feb 13 09:51:59.550488 kernel: ACPI: Dynamic OEM Table Load:
Feb 13 09:51:59.550492 kernel: ACPI: SSDT 0xFFFF97DC41A52000 0005FC (v02 PmRef ApIst 00003000 INTL 20160527)
Feb 13 09:51:59.550497 kernel: ACPI: Dynamic OEM Table Load:
Feb 13 09:51:59.550503 kernel: ACPI: SSDT 0xFFFF97DC4014E000 000AB0 (v02 PmRef ApPsd 00003000 INTL 20160527)
Feb 13 09:51:59.550508 kernel: ACPI: Dynamic OEM Table Load:
Feb 13 09:51:59.550513 kernel: ACPI: SSDT 0xFFFF97DC41AE7800 00030A (v02 PmRef ApCst 00003000 INTL 20160527)
Feb 13 09:51:59.550518 kernel: ACPI: Interpreter enabled
Feb 13 09:51:59.550523 kernel: ACPI: PM: (supports S0 S5)
Feb 13 09:51:59.550528 kernel: ACPI: Using IOAPIC for interrupt routing
Feb 13 09:51:59.550533 kernel: HEST: Enabling Firmware First mode for corrected errors.
Feb 13 09:51:59.550539 kernel: mce: [Firmware Bug]: Ignoring request to disable invalid MCA bank 14.
Feb 13 09:51:59.550543 kernel: HEST: Table parsing has been initialized.
Feb 13 09:51:59.550549 kernel: GHES: APEI firmware first mode is enabled by APEI bit and WHEA _OSC.
Feb 13 09:51:59.550554 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Feb 13 09:51:59.550559 kernel: ACPI: Enabled 9 GPEs in block 00 to 7F
Feb 13 09:51:59.550564 kernel: ACPI: PM: Power Resource [USBC]
Feb 13 09:51:59.550569 kernel: ACPI: PM: Power Resource [V0PR]
Feb 13 09:51:59.550574 kernel: ACPI: PM: Power Resource [V1PR]
Feb 13 09:51:59.550579 kernel: ACPI: PM: Power Resource [V2PR]
Feb 13 09:51:59.550584 kernel: ACPI: PM: Power Resource [WRST]
Feb 13 09:51:59.550589 kernel: ACPI: PM: Power Resource [FN00]
Feb 13 09:51:59.550595 kernel: ACPI: PM: Power Resource [FN01]
Feb 13 09:51:59.550600 kernel: ACPI: PM: Power Resource [FN02]
Feb 13 09:51:59.550605 kernel: ACPI: PM: Power Resource [FN03]
Feb 13 09:51:59.550610 kernel: ACPI: PM: Power Resource [FN04]
Feb 13 09:51:59.550615 kernel: ACPI: PM: Power Resource [PIN]
Feb 13 09:51:59.550620 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-fe])
Feb 13 09:51:59.550684 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Feb 13 09:51:59.550730 kernel: acpi PNP0A08:00: _OSC: platform does not support [AER]
Feb 13 09:51:59.550773 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME PCIeCapability LTR]
Feb 13 09:51:59.550780 kernel: PCI host bridge to bus 0000:00
Feb 13 09:51:59.550823 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Feb 13 09:51:59.550860 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Feb 13 09:51:59.550898 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Feb 13 09:51:59.550935 kernel: pci_bus 0000:00: root bus resource [mem 0x90000000-0xdfffffff window]
Feb 13 09:51:59.550970 kernel: pci_bus 0000:00: root bus resource [mem 0xfc800000-0xfe7fffff window]
Feb 13 09:51:59.551009 kernel: pci_bus 0000:00: root bus resource [bus 00-fe]
Feb 13 09:51:59.551058 kernel: pci 0000:00:00.0: [8086:3e31] type 00 class 0x060000
Feb 13 09:51:59.551108 kernel: pci 0000:00:01.0: [8086:1901] type 01 class 0x060400
Feb 13 09:51:59.551152 kernel: pci 0000:00:01.0: PME# supported from D0 D3hot D3cold
Feb 13 09:51:59.551197 kernel: pci 0000:00:08.0: [8086:1911] type 00 class 0x088000
Feb 13 09:51:59.551239 kernel: pci 0000:00:08.0: reg 0x10: [mem 0x9551f000-0x9551ffff 64bit]
Feb 13 09:51:59.551286 kernel: pci 0000:00:12.0: [8086:a379] type 00 class 0x118000
Feb 13 09:51:59.551329 kernel: pci 0000:00:12.0: reg 0x10: [mem 0x9551e000-0x9551efff 64bit]
Feb 13 09:51:59.551375 kernel: pci 0000:00:14.0: [8086:a36d] type 00 class 0x0c0330
Feb 13 09:51:59.551418 kernel: pci 0000:00:14.0: reg 0x10: [mem 0x95500000-0x9550ffff 64bit]
Feb 13 09:51:59.551463 kernel: pci 0000:00:14.0: PME# supported from D3hot D3cold
Feb 13 09:51:59.551510 kernel: pci 0000:00:14.2: [8086:a36f] type 00 class 0x050000
Feb 13 09:51:59.551554 kernel: pci 0000:00:14.2: reg 0x10: [mem 0x95512000-0x95513fff 64bit]
Feb 13 09:51:59.551594 kernel: pci 0000:00:14.2: reg 0x18: [mem 0x9551d000-0x9551dfff 64bit]
Feb 13 09:51:59.551640 kernel: pci 0000:00:15.0: [8086:a368] type 00 class 0x0c8000
Feb 13 09:51:59.551681 kernel: pci 0000:00:15.0: reg 0x10: [mem 0x00000000-0x00000fff 64bit]
Feb 13 09:51:59.551728 kernel: pci 0000:00:15.1: [8086:a369] type 00 class 0x0c8000
Feb 13 09:51:59.551769 kernel: pci 0000:00:15.1: reg 0x10: [mem 0x00000000-0x00000fff 64bit]
Feb 13 09:51:59.551816 kernel: pci 0000:00:16.0: [8086:a360] type 00 class 0x078000
Feb 13 09:51:59.551857 kernel: pci 0000:00:16.0: reg 0x10: [mem 0x9551a000-0x9551afff 64bit]
Feb 13 09:51:59.551899 kernel: pci 0000:00:16.0: PME# supported from D3hot
Feb 13 09:51:59.551943 kernel: pci 0000:00:16.1: [8086:a361] type 00 class 0x078000
Feb 13 09:51:59.551984 kernel: pci 0000:00:16.1: reg 0x10: [mem 0x95519000-0x95519fff 64bit]
Feb 13 09:51:59.552025 kernel: pci 0000:00:16.1: PME# supported from D3hot
Feb 13 09:51:59.552071 kernel: pci 0000:00:16.4: [8086:a364] type 00 class 0x078000
Feb 13 09:51:59.552114 kernel: pci 0000:00:16.4: reg 0x10: [mem 0x95518000-0x95518fff 64bit]
Feb 13 09:51:59.552155 kernel: pci 0000:00:16.4: PME# supported from D3hot
Feb 13 09:51:59.552200 kernel: pci 0000:00:17.0: [8086:a352] type 00 class 0x010601
Feb 13 09:51:59.552242 kernel: pci 0000:00:17.0: reg 0x10: [mem 0x95510000-0x95511fff]
Feb 13 09:51:59.552282 kernel: pci 0000:00:17.0: reg 0x14: [mem 0x95517000-0x955170ff]
Feb 13 09:51:59.552323 kernel: pci 0000:00:17.0: reg 0x18: [io 0x6050-0x6057]
Feb 13 09:51:59.552363 kernel: pci 0000:00:17.0: reg 0x1c: [io 0x6040-0x6043]
Feb 13 09:51:59.552412 kernel: pci 0000:00:17.0: reg 0x20: [io 0x6020-0x603f]
Feb 13 09:51:59.552456 kernel: pci 0000:00:17.0: reg 0x24: [mem 0x95516000-0x955167ff]
Feb 13 09:51:59.552498 kernel: pci 0000:00:17.0: PME# supported from D3hot
Feb 13 09:51:59.552543 kernel: pci 0000:00:1b.0: [8086:a340] type 01 class 0x060400
Feb 13 09:51:59.552586 kernel: pci 0000:00:1b.0: PME# supported from D0 D3hot D3cold
Feb 13 09:51:59.552631 kernel: pci 0000:00:1b.4: [8086:a32c] type 01 class 0x060400
Feb 13 09:51:59.552673 kernel: pci 0000:00:1b.4: PME# supported from D0 D3hot D3cold
Feb 13 09:51:59.552723 kernel: pci 0000:00:1b.5: [8086:a32d] type 01 class 0x060400
Feb 13 09:51:59.552766 kernel: pci 0000:00:1b.5: PME# supported from D0 D3hot D3cold
Feb 13 09:51:59.552811 kernel: pci 0000:00:1c.0: [8086:a338] type 01 class 0x060400
Feb 13 09:51:59.552853 kernel: pci 0000:00:1c.0: PME# supported from D0 D3hot D3cold
Feb 13 09:51:59.552900 kernel: pci 0000:00:1c.3: [8086:a33b] type 01 class 0x060400
Feb 13 09:51:59.552944 kernel: pci 0000:00:1c.3: PME# supported from D0 D3hot D3cold
Feb 13 09:51:59.552988 kernel: pci 0000:00:1e.0: [8086:a328] type 00 class 0x078000
Feb 13 09:51:59.553030 kernel: pci 0000:00:1e.0: reg 0x10: [mem 0x00000000-0x00000fff 64bit]
Feb 13 09:51:59.553077 kernel: pci 0000:00:1f.0: [8086:a309] type 00 class 0x060100
Feb 13 09:51:59.553124 kernel: pci 0000:00:1f.4: [8086:a323] type 00 class 0x0c0500
Feb 13 09:51:59.553165 kernel: pci 0000:00:1f.4: reg 0x10: [mem 0x95514000-0x955140ff 64bit]
Feb 13 09:51:59.553207 kernel: pci 0000:00:1f.4: reg 0x20: [io 0xefa0-0xefbf]
Feb 13 09:51:59.553253 kernel: pci 0000:00:1f.5: [8086:a324] type 00 class 0x0c8000
Feb 13 09:51:59.553296 kernel: pci 0000:00:1f.5: reg 0x10: [mem 0xfe010000-0xfe010fff]
Feb 13 09:51:59.553342 kernel: pci 0000:01:00.0: [15b3:1015] type 00 class 0x020000
Feb 13 09:51:59.553388 kernel: pci 0000:01:00.0: reg 0x10: [mem 0x92000000-0x93ffffff 64bit pref]
Feb 13 09:51:59.553431 kernel: pci 0000:01:00.0: reg 0x30: [mem 0x95200000-0x952fffff pref]
Feb 13 09:51:59.553477 kernel: pci 0000:01:00.0: PME# supported from D3cold
Feb 13 09:51:59.553521 kernel: pci 0000:01:00.0: reg 0x1a4: [mem 0x00000000-0x000fffff 64bit pref]
Feb 13 09:51:59.553564 kernel: pci 0000:01:00.0: VF(n) BAR0 space: [mem 0x00000000-0x007fffff 64bit pref] (contains BAR0 for 8 VFs)
Feb 13 09:51:59.553611 kernel: pci 0000:01:00.1: [15b3:1015] type 00 class 0x020000
Feb 13 09:51:59.553655 kernel: pci 0000:01:00.1: reg 0x10: [mem 0x90000000-0x91ffffff 64bit pref]
Feb 13 09:51:59.553700 kernel: pci 0000:01:00.1: reg 0x30: [mem 0x95100000-0x951fffff pref]
Feb 13 09:51:59.553742 kernel: pci 0000:01:00.1: PME# supported from D3cold
Feb 13 09:51:59.553785 kernel: pci 0000:01:00.1: reg 0x1a4: [mem 0x00000000-0x000fffff 64bit pref]
Feb 13 09:51:59.553827 kernel: pci 0000:01:00.1: VF(n) BAR0 space: [mem 0x00000000-0x007fffff 64bit pref] (contains BAR0 for 8 VFs)
Feb 13 09:51:59.553870 kernel: pci 0000:00:01.0: PCI bridge to [bus 01]
Feb 13 09:51:59.553911 kernel: pci 0000:00:01.0: bridge window [mem 0x95100000-0x952fffff]
Feb 13 09:51:59.553954 kernel: pci 0000:00:01.0: bridge window [mem 0x90000000-0x93ffffff 64bit pref]
Feb 13 09:51:59.553996 kernel: pci 0000:00:1b.0: PCI bridge to [bus 02]
Feb 13 09:51:59.554046 kernel: pci 0000:03:00.0: [8086:1533] type 00 class 0x020000
Feb 13 09:51:59.554090 kernel: pci 0000:03:00.0: reg 0x10: [mem 0x95400000-0x9547ffff]
Feb 13 09:51:59.554132 kernel: pci 0000:03:00.0: reg 0x18: [io 0x5000-0x501f]
Feb 13 09:51:59.554175 kernel: pci 0000:03:00.0: reg 0x1c: [mem 0x95480000-0x95483fff]
Feb 13 09:51:59.554218 kernel: pci 0000:03:00.0: PME# supported from D0 D3hot D3cold
Feb 13 09:51:59.554261 kernel: pci 0000:00:1b.4: PCI bridge to [bus 03]
Feb 13 09:51:59.554302 kernel: pci 0000:00:1b.4: bridge window [io 0x5000-0x5fff]
Feb 13 09:51:59.554346 kernel: pci 0000:00:1b.4: bridge window [mem 0x95400000-0x954fffff]
Feb 13 09:51:59.554393 kernel: pci 0000:04:00.0: [8086:1533] type 00 class 0x020000
Feb 13 09:51:59.554436 kernel: pci 0000:04:00.0: reg 0x10: [mem 0x95300000-0x9537ffff]
Feb 13 09:51:59.554482 kernel: pci 0000:04:00.0: reg 0x18: [io 0x4000-0x401f]
Feb 13 09:51:59.554524 kernel: pci 0000:04:00.0: reg 0x1c: [mem 0x95380000-0x95383fff]
Feb 13 09:51:59.554568 kernel: pci 0000:04:00.0: PME# supported from D0 D3hot D3cold
Feb 13 09:51:59.554610 kernel: pci 0000:00:1b.5: PCI bridge to [bus 04]
Feb 13 09:51:59.554653 kernel: pci 0000:00:1b.5: bridge window [io 0x4000-0x4fff]
Feb 13 09:51:59.554715 kernel: pci 0000:00:1b.5: bridge window [mem 0x95300000-0x953fffff]
Feb 13 09:51:59.554757 kernel: pci 0000:00:1c.0: PCI bridge to [bus 05]
Feb 13 09:51:59.554802 kernel: pci 0000:06:00.0: [1a03:1150] type 01 class 0x060400
Feb 13 09:51:59.554846 kernel: pci 0000:06:00.0: enabling Extended Tags
Feb 13 09:51:59.554888 kernel: pci 0000:06:00.0: supports D1 D2
Feb 13 09:51:59.554932 kernel: pci 0000:06:00.0: PME# supported from D0 D1 D2 D3hot D3cold
Feb 13 09:51:59.554973 kernel: pci 0000:00:1c.3: PCI bridge to [bus 06-07]
Feb 13 09:51:59.555015 kernel: pci 0000:00:1c.3: bridge window [io 0x3000-0x3fff]
Feb 13 09:51:59.555057 kernel: pci 0000:00:1c.3: bridge window [mem 0x94000000-0x950fffff]
Feb 13 09:51:59.555103 kernel: pci_bus 0000:07: extended config space not accessible
Feb 13 09:51:59.555152 kernel: pci 0000:07:00.0: [1a03:2000] type 00 class 0x030000
Feb 13 09:51:59.555196 kernel: pci 0000:07:00.0: reg 0x10: [mem 0x94000000-0x94ffffff]
Feb 13 09:51:59.555241 kernel: pci 0000:07:00.0: reg 0x14: [mem 0x95000000-0x9501ffff]
Feb 13 09:51:59.555358 kernel: pci 0000:07:00.0: reg 0x18: [io 0x3000-0x307f]
Feb 13 09:51:59.555404 kernel: pci 0000:07:00.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Feb 13 09:51:59.555450 kernel: pci 0000:07:00.0: supports D1 D2
Feb 13 09:51:59.555518 kernel: pci 0000:07:00.0: PME# supported from D0 D1 D2 D3hot D3cold
Feb 13 09:51:59.555561 kernel: pci 0000:06:00.0: PCI bridge to [bus 07]
Feb 13 09:51:59.555603 kernel: pci 0000:06:00.0: bridge window [io 0x3000-0x3fff]
Feb 13 09:51:59.555645 kernel: pci 0000:06:00.0: bridge window [mem 0x94000000-0x950fffff]
Feb 13 09:51:59.555652 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 0
Feb 13 09:51:59.555658 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 1
Feb 13 09:51:59.555665 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 0
Feb 13 09:51:59.555670 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 0
Feb 13 09:51:59.555676 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 0
Feb 13 09:51:59.555681 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 0
Feb 13 09:51:59.555686 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 0
Feb 13 09:51:59.555691 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 0
Feb 13 09:51:59.555697 kernel: iommu: Default domain type: Translated
Feb 13 09:51:59.555702 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Feb 13 09:51:59.555745 kernel: pci 0000:07:00.0: vgaarb: setting as boot VGA device
Feb 13 09:51:59.555791 kernel: pci 0000:07:00.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Feb 13 09:51:59.555837 kernel: pci 0000:07:00.0: vgaarb: bridge control possible
Feb 13 09:51:59.555844 kernel: vgaarb: loaded
Feb 13 09:51:59.555850 kernel: pps_core: LinuxPPS API ver. 1 registered
Feb 13 09:51:59.555855 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti
Feb 13 09:51:59.555860 kernel: PTP clock support registered
Feb 13 09:51:59.555866 kernel: PCI: Using ACPI for IRQ routing
Feb 13 09:51:59.555871 kernel: PCI: pci_cache_line_size set to 64 bytes
Feb 13 09:51:59.555876 kernel: e820: reserve RAM buffer [mem 0x00099800-0x0009ffff]
Feb 13 09:51:59.555883 kernel: e820: reserve RAM buffer [mem 0x81f89000-0x83ffffff]
Feb 13 09:51:59.555888 kernel: e820: reserve RAM buffer [mem 0x8afcd000-0x8bffffff]
Feb 13 09:51:59.555893 kernel: e820: reserve RAM buffer [mem 0x8c23b000-0x8fffffff]
Feb 13 09:51:59.555898 kernel: e820: reserve RAM buffer [mem 0x8ef00000-0x8fffffff]
Feb 13 09:51:59.555903 kernel: e820: reserve RAM buffer [mem 0x86f000000-0x86fffffff]
Feb 13 09:51:59.555908 kernel: clocksource: Switched to clocksource tsc-early
Feb 13 09:51:59.555914 kernel: VFS: Disk quotas dquot_6.6.0
Feb 13 09:51:59.555919 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Feb 13 09:51:59.555924 kernel: pnp: PnP ACPI init
Feb 13 09:51:59.555967 kernel: system 00:00: [mem 0x40000000-0x403fffff] has been reserved
Feb 13 09:51:59.556011 kernel: pnp 00:02: [dma 0 disabled]
Feb 13 09:51:59.556052 kernel: pnp 00:03: [dma 0 disabled]
Feb 13 09:51:59.556094 kernel: system 00:04: [io 0x0680-0x069f] has been reserved
Feb 13 09:51:59.556133 kernel: system 00:04: [io 0x164e-0x164f] has been reserved
Feb 13 09:51:59.556172 kernel: system 00:05: [io 0x1854-0x1857] has been reserved
Feb 13 09:51:59.556214 kernel: system 00:06: [mem 0xfed10000-0xfed17fff] has been reserved
Feb 13 09:51:59.556251 kernel: system 00:06: [mem 0xfed18000-0xfed18fff] has been reserved
Feb 13 09:51:59.556288 kernel: system 00:06: [mem 0xfed19000-0xfed19fff] has been reserved
Feb 13 09:51:59.556325 kernel: system 00:06: [mem 0xe0000000-0xefffffff] has been reserved
Feb 13 09:51:59.556362 kernel: system 00:06: [mem 0xfed20000-0xfed3ffff] has been reserved
Feb 13 09:51:59.556398 kernel: system 00:06: [mem 0xfed90000-0xfed93fff] could not be reserved
Feb 13 09:51:59.556434 kernel: system 00:06: [mem 0xfed45000-0xfed8ffff] has been reserved
Feb 13 09:51:59.556498 kernel: system 00:06: [mem 0xfee00000-0xfeefffff] could not be reserved
Feb 13 09:51:59.556557 kernel: system 00:07: [io 0x1800-0x18fe] could not be reserved
Feb 13 09:51:59.556595 kernel: system 00:07: [mem 0xfd000000-0xfd69ffff] has been reserved
Feb 13 09:51:59.556631 kernel: system 00:07: [mem 0xfd6c0000-0xfd6cffff] has been reserved
Feb 13 09:51:59.556669 kernel: system 00:07: [mem 0xfd6f0000-0xfdffffff] has been reserved
Feb 13 09:51:59.556705 kernel: system 00:07: [mem 0xfe000000-0xfe01ffff] could not be reserved
Feb 13 09:51:59.556741 kernel: system 00:07: [mem 0xfe200000-0xfe7fffff] has been reserved
Feb 13 09:51:59.556780 kernel: system 00:07: [mem 0xff000000-0xffffffff] has been reserved
Feb 13 09:51:59.556820 kernel: system 00:08: [io 0x2000-0x20fe] has been reserved
Feb 13 09:51:59.556827 kernel: pnp: PnP ACPI: found 10 devices
Feb 13 09:51:59.556833 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Feb 13 09:51:59.556838 kernel: NET: Registered PF_INET protocol family
Feb 13 09:51:59.556844 kernel: IP idents hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Feb 13 09:51:59.556849 kernel: tcp_listen_portaddr_hash hash table entries: 16384 (order: 6, 262144 bytes, linear)
Feb 13 09:51:59.556856 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Feb 13 09:51:59.556861 kernel: TCP established hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Feb 13 09:51:59.556867 kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Feb 13 09:51:59.556872 kernel: TCP: Hash tables configured (established 262144 bind 65536)
Feb 13 09:51:59.556877 kernel: UDP hash table entries: 16384 (order: 7, 524288 bytes, linear)
Feb 13 09:51:59.556882 kernel: UDP-Lite hash table entries: 16384 (order: 7, 524288 bytes, linear)
Feb 13 09:51:59.556888 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Feb 13 09:51:59.556893 kernel: NET: Registered PF_XDP protocol family
Feb 13 09:51:59.556936 kernel: pci 0000:00:15.0: BAR 0: assigned [mem 0x95515000-0x95515fff 64bit]
Feb 13 09:51:59.556979 kernel: pci 0000:00:15.1: BAR 0: assigned [mem 0x9551b000-0x9551bfff 64bit]
Feb 13 09:51:59.557021 kernel: pci 0000:00:1e.0: BAR 0: assigned [mem 0x9551c000-0x9551cfff 64bit]
Feb 13 09:51:59.557064 kernel: pci 0000:01:00.0: BAR 7: no space for [mem size 0x00800000 64bit pref]
Feb 13 09:51:59.557108 kernel: pci 0000:01:00.0: BAR 7: failed to assign [mem size 0x00800000 64bit pref]
Feb 13 09:51:59.557150 kernel: pci 0000:01:00.1: BAR 7: no space for [mem size 0x00800000 64bit pref]
Feb 13 09:51:59.557193 kernel: pci 0000:01:00.1: BAR 7: failed to assign [mem size 0x00800000 64bit pref]
Feb 13 09:51:59.557234 kernel: pci 0000:00:01.0: PCI bridge to [bus 01]
Feb 13 09:51:59.557275 kernel: pci 0000:00:01.0: bridge window [mem 0x95100000-0x952fffff]
Feb 13 09:51:59.557318 kernel: pci 0000:00:01.0: bridge window [mem 0x90000000-0x93ffffff 64bit pref]
Feb 13 09:51:59.557359 kernel: pci 0000:00:1b.0: PCI bridge to [bus 02]
Feb 13 09:51:59.557400 kernel: pci 0000:00:1b.4: PCI bridge to [bus 03]
Feb 13 09:51:59.557441 kernel: pci 0000:00:1b.4: bridge window [io 0x5000-0x5fff]
Feb 13 09:51:59.557522 kernel: pci 0000:00:1b.4: bridge window [mem 0x95400000-0x954fffff]
Feb 13 09:51:59.557565 kernel: pci 0000:00:1b.5: PCI bridge to [bus 04]
Feb 13 09:51:59.557605 kernel: pci 0000:00:1b.5: bridge window [io 0x4000-0x4fff]
Feb 13 09:51:59.557647 kernel: pci 0000:00:1b.5: bridge window [mem 0x95300000-0x953fffff]
Feb 13 09:51:59.557687 kernel: pci 0000:00:1c.0: PCI bridge to [bus 05]
Feb 13 09:51:59.557730 kernel: pci 0000:06:00.0: PCI bridge to [bus 07]
Feb 13 09:51:59.557772 kernel: pci 0000:06:00.0: bridge window [io 0x3000-0x3fff]
Feb 13 09:51:59.557814 kernel: pci 0000:06:00.0: bridge window [mem 0x94000000-0x950fffff]
Feb 13 09:51:59.557855 kernel: pci 0000:00:1c.3: PCI bridge to [bus 06-07]
Feb 13 09:51:59.557898 kernel: pci 0000:00:1c.3: bridge window [io 0x3000-0x3fff]
Feb 13 09:51:59.557939 kernel: pci 0000:00:1c.3: bridge window [mem 0x94000000-0x950fffff]
Feb 13 09:51:59.557975 kernel: pci_bus 0000:00: Some PCI device resources are unassigned, try booting with pci=realloc
Feb 13 09:51:59.558014 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
Feb 13 09:51:59.558049 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
Feb 13 09:51:59.558086 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Feb 13 09:51:59.558121 kernel: pci_bus 0000:00: resource 7 [mem 0x90000000-0xdfffffff window]
Feb 13 09:51:59.558156 kernel: pci_bus 0000:00: resource 8 [mem 0xfc800000-0xfe7fffff window]
Feb 13 09:51:59.558198 kernel: pci_bus 0000:01: resource 1 [mem 0x95100000-0x952fffff]
Feb 13 09:51:59.558238 kernel: pci_bus 0000:01: resource 2 [mem 0x90000000-0x93ffffff 64bit pref]
Feb 13 09:51:59.558282 kernel: pci_bus 0000:03: resource 0 [io 0x5000-0x5fff]
Feb 13 09:51:59.558320 kernel: pci_bus 0000:03: resource 1 [mem 0x95400000-0x954fffff]
Feb 13 09:51:59.558363 kernel: pci_bus 0000:04: resource 0 [io 0x4000-0x4fff]
Feb 13 09:51:59.558400 kernel: pci_bus 0000:04: resource 1 [mem 0x95300000-0x953fffff]
Feb 13 09:51:59.558445 kernel: pci_bus 0000:06: resource 0 [io 0x3000-0x3fff]
Feb 13 09:51:59.558508 kernel: pci_bus 0000:06: resource 1 [mem 0x94000000-0x950fffff]
Feb 13 09:51:59.558548 kernel: pci_bus 0000:07: resource 0 [io 0x3000-0x3fff]
Feb 13 09:51:59.558588 kernel: pci_bus 0000:07: resource 1 [mem 0x94000000-0x950fffff]
Feb 13 09:51:59.558596 kernel: PCI: CLS 64 bytes, default 64
Feb 13 09:51:59.558602 kernel: DMAR: No ATSR found
Feb 13 09:51:59.558607 kernel: DMAR: No SATC found
Feb 13 09:51:59.558613 kernel: DMAR: dmar0: Using Queued invalidation
Feb 13 09:51:59.558656 kernel: pci 0000:00:00.0: Adding to iommu group 0
Feb 13 09:51:59.558699 kernel: pci 0000:00:01.0: Adding to iommu group 1
Feb 13 09:51:59.558741 kernel: pci 0000:00:08.0: Adding to iommu group 2
Feb 13 09:51:59.558784 kernel: pci 0000:00:12.0: Adding to iommu group 3
Feb 13 09:51:59.558825 kernel: pci 0000:00:14.0: Adding to iommu group 4
Feb 13 09:51:59.558867 kernel: pci 0000:00:14.2: Adding to iommu group 4
Feb 13 09:51:59.558907 kernel: pci 0000:00:15.0: Adding to iommu group 5
Feb 13 09:51:59.558948 kernel: pci 0000:00:15.1: Adding to iommu group 5
Feb 13 09:51:59.558992 kernel: pci 0000:00:16.0: Adding to iommu group 6
Feb 13 09:51:59.559033 kernel: pci 0000:00:16.1: Adding to iommu group 6
Feb 13 09:51:59.559075 kernel: pci 0000:00:16.4: Adding to iommu group 6
Feb 13 09:51:59.559116 kernel: pci 0000:00:17.0: Adding to iommu group 7
Feb 13 09:51:59.559160 kernel: pci 0000:00:1b.0: Adding to iommu group 8
Feb 13 09:51:59.559200 kernel: pci 0000:00:1b.4: Adding to iommu group 9
Feb 13 09:51:59.559241 kernel: pci 0000:00:1b.5: Adding to iommu group 10
Feb 13 09:51:59.559283 kernel: pci 0000:00:1c.0: Adding to iommu group 11
Feb 13 09:51:59.559324 kernel: pci 0000:00:1c.3: Adding to iommu group 12
Feb 13 09:51:59.559367 kernel: pci 0000:00:1e.0: Adding to iommu group 13
Feb 13 09:51:59.559408 kernel: pci 0000:00:1f.0: Adding to iommu group 14
Feb 13 09:51:59.559450 kernel: pci 0000:00:1f.4: Adding to iommu group 14
Feb 13 09:51:59.559493 kernel: pci 0000:00:1f.5: Adding to iommu group 14
Feb 13 09:51:59.559538 kernel: pci 0000:01:00.0: Adding to iommu group 1
Feb 13 09:51:59.559581 kernel: pci 0000:01:00.1: Adding to iommu group 1
Feb 13 09:51:59.559624 kernel: pci 0000:03:00.0: Adding to iommu group 15
Feb 13 09:51:59.559670 kernel: pci 0000:04:00.0: Adding to iommu group 16
Feb 13 09:51:59.559727 kernel: pci 0000:06:00.0: Adding to iommu group 17
Feb 13 09:51:59.559773 kernel: pci 0000:07:00.0: Adding to iommu group 17
Feb 13 09:51:59.559780 kernel: DMAR: Intel(R) Virtualization Technology for Directed I/O
Feb 13 09:51:59.559786 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Feb 13 09:51:59.559791 kernel: software IO TLB: mapped [mem 0x0000000086fcd000-0x000000008afcd000] (64MB)
Feb 13 09:51:59.559797 kernel: RAPL PMU: API unit is 2^-32 Joules, 3 fixed counters, 655360 ms ovfl timer
Feb 13 09:51:59.559802 kernel: RAPL PMU: hw unit of domain pp0-core 2^-14 Joules
Feb 13 09:51:59.559807 kernel: RAPL PMU: hw unit of domain package 2^-14 Joules
Feb 13 09:51:59.559814 kernel: RAPL PMU: hw unit of domain dram 2^-14 Joules
Feb 13 09:51:59.559857 kernel: platform rtc_cmos: registered platform RTC device (no PNP device found)
Feb 13 09:51:59.559865 kernel: Initialise system trusted keyrings
Feb 13 09:51:59.559871 kernel: workingset: timestamp_bits=39 max_order=23 bucket_order=0
Feb 13 09:51:59.559876 kernel: Key type asymmetric registered
Feb 13 09:51:59.559881 kernel: Asymmetric key parser 'x509' registered
Feb 13 09:51:59.559886 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249)
Feb 13 09:51:59.559892 kernel: io scheduler mq-deadline registered
Feb 13 09:51:59.559898 kernel: io scheduler kyber registered
Feb 13 09:51:59.559904 kernel: io scheduler bfq registered
Feb 13 09:51:59.559945 kernel: pcieport 0000:00:01.0: PME: Signaling with IRQ 121
Feb 13 09:51:59.559986 kernel: pcieport 0000:00:1b.0: PME: Signaling with IRQ 122
Feb 13 09:51:59.560028 kernel: pcieport 0000:00:1b.4: PME: Signaling with IRQ 123
Feb 13 09:51:59.560069 kernel: pcieport 0000:00:1b.5: PME: Signaling with IRQ 124
Feb 13 09:51:59.560111 kernel: pcieport 0000:00:1c.0: PME: Signaling with IRQ 125
Feb 13 09:51:59.560151 kernel: pcieport 0000:00:1c.3: PME: Signaling with IRQ 126
Feb 13 09:51:59.560198 kernel: thermal LNXTHERM:00: registered as thermal_zone0
Feb 13 09:51:59.560206 kernel: ACPI: thermal: Thermal Zone [TZ00] (28 C)
Feb 13 09:51:59.560212 kernel: ERST: Error Record Serialization Table (ERST) support is initialized.
Feb 13 09:51:59.560217 kernel: pstore: Registered erst as persistent store backend
Feb 13 09:51:59.560222 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00
Feb 13 09:51:59.560228 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Feb 13 09:51:59.560233 kernel: 00:02: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Feb 13 09:51:59.560238 kernel: 00:03: ttyS1 at I/O 0x2f8 (irq = 3, base_baud = 115200) is a 16550A
Feb 13 09:51:59.560245 kernel: hpet_acpi_add: no address or irqs in _CRS
Feb 13 09:51:59.560288 kernel: tpm_tis MSFT0101:00: 2.0 TPM (device-id 0x1B, rev-id 16)
Feb 13 09:51:59.560296 kernel: i8042: PNP: No PS/2 controller found.
Feb 13 09:51:59.560333 kernel: rtc_cmos rtc_cmos: RTC can wake from S4
Feb 13 09:51:59.560371 kernel: rtc_cmos rtc_cmos: registered as rtc0
Feb 13 09:51:59.560409 kernel: rtc_cmos rtc_cmos: setting system clock to 2024-02-13T09:51:58 UTC (1707817918)
Feb 13 09:51:59.560446 kernel: rtc_cmos rtc_cmos: alarms up to one month, y3k, 114 bytes nvram
Feb 13 09:51:59.560456 kernel: fail to initialize ptp_kvm
Feb 13 09:51:59.560463 kernel: intel_pstate: Intel P-state driver initializing
Feb 13 09:51:59.560489 kernel: intel_pstate: Disabling energy efficiency optimization
Feb 13 09:51:59.560494 kernel: intel_pstate: HWP enabled
Feb 13 09:51:59.560500 kernel: vesafb: mode is 1024x768x8, linelength=1024, pages=0
Feb 13 09:51:59.560505 kernel: vesafb: scrolling: redraw
Feb 13 09:51:59.560530 kernel: vesafb: Pseudocolor: size=0:8:8:8, shift=0:0:0:0
Feb 13 09:51:59.560535 kernel: vesafb: framebuffer at 0x94000000, mapped to 0x00000000f6aee0b1, using 768k, total 768k
Feb 13 09:51:59.560540 kernel: Console: switching to colour frame buffer device 128x48
Feb 13 09:51:59.560547 kernel: fb0: VESA VGA frame buffer device
Feb 13 09:51:59.560552 kernel: NET: Registered PF_INET6 protocol family
Feb 13 09:51:59.560557 kernel: Segment Routing with IPv6
Feb 13 09:51:59.560562 kernel: In-situ OAM (IOAM) with IPv6
Feb 13 09:51:59.560568 kernel: NET: Registered PF_PACKET protocol family
Feb 13 09:51:59.560573 kernel: Key type dns_resolver registered
Feb 13 09:51:59.560578 kernel: microcode: sig=0x906ed, pf=0x2, revision=0xf4
Feb 13 09:51:59.560583 kernel: microcode: Microcode Update Driver: v2.2.
Feb 13 09:51:59.560588 kernel: IPI shorthand broadcast: enabled
Feb 13 09:51:59.560594 kernel: sched_clock: Marking stable (1734718681, 1339384192)->(4492104211, -1418001338)
Feb 13 09:51:59.560600 kernel: registered taskstats version 1
Feb 13 09:51:59.560605 kernel: Loading compiled-in X.509 certificates
Feb 13 09:51:59.560610 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 5.15.148-flatcar: 253e5c5c936b12e2ff2626e7f3214deb753330c8'
Feb 13 09:51:59.560615 kernel: Key type .fscrypt registered
Feb 13 09:51:59.560620 kernel: Key type fscrypt-provisioning registered
Feb 13 09:51:59.560626 kernel: pstore: Using crash dump compression: deflate
Feb 13 09:51:59.560631 kernel: ima: Allocated hash algorithm: sha1
Feb 13 09:51:59.560636 kernel: ima: No architecture policies found
Feb 13 09:51:59.560642 kernel: Freeing unused kernel image (initmem) memory: 45496K
Feb 13 09:51:59.560648 kernel: Write protecting the kernel read-only data: 28672k
Feb 13 09:51:59.560653 kernel: Freeing unused kernel image (text/rodata gap) memory: 2040K
Feb 13 09:51:59.560659 kernel: Freeing unused kernel image (rodata/data gap) memory: 636K
Feb 13 09:51:59.560664 kernel: Run /init as init process
Feb 13 09:51:59.560669 kernel: with arguments:
Feb 13 09:51:59.560674 kernel: /init
Feb 13 09:51:59.560680 kernel: with environment:
Feb 13 09:51:59.560685 kernel: HOME=/
Feb 13 09:51:59.560691 kernel: TERM=linux
Feb 13 09:51:59.560696 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
Feb 13 09:51:59.560702 systemd[1]: systemd 252 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL -ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE -TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Feb 13 09:51:59.560709 systemd[1]: Detected architecture x86-64.
Feb 13 09:51:59.560714 systemd[1]: Running in initrd. Feb 13 09:51:59.560720 systemd[1]: No hostname configured, using default hostname. Feb 13 09:51:59.560725 systemd[1]: Hostname set to . Feb 13 09:51:59.560730 systemd[1]: Initializing machine ID from random generator. Feb 13 09:51:59.560737 systemd[1]: Queued start job for default target initrd.target. Feb 13 09:51:59.560742 systemd[1]: Started systemd-ask-password-console.path. Feb 13 09:51:59.560747 systemd[1]: Reached target cryptsetup.target. Feb 13 09:51:59.560753 systemd[1]: Reached target paths.target. Feb 13 09:51:59.560758 systemd[1]: Reached target slices.target. Feb 13 09:51:59.560763 systemd[1]: Reached target swap.target. Feb 13 09:51:59.560769 systemd[1]: Reached target timers.target. Feb 13 09:51:59.560774 systemd[1]: Listening on iscsid.socket. Feb 13 09:51:59.560780 systemd[1]: Listening on iscsiuio.socket. Feb 13 09:51:59.560786 systemd[1]: Listening on systemd-journald-audit.socket. Feb 13 09:51:59.560791 systemd[1]: Listening on systemd-journald-dev-log.socket. Feb 13 09:51:59.560797 systemd[1]: Listening on systemd-journald.socket. Feb 13 09:51:59.560802 kernel: tsc: Refined TSC clocksource calibration: 3407.999 MHz Feb 13 09:51:59.560808 systemd[1]: Listening on systemd-networkd.socket. Feb 13 09:51:59.560813 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x311fd336761, max_idle_ns: 440795243819 ns Feb 13 09:51:59.560818 kernel: clocksource: Switched to clocksource tsc Feb 13 09:51:59.560824 systemd[1]: Listening on systemd-udevd-control.socket. Feb 13 09:51:59.560830 systemd[1]: Listening on systemd-udevd-kernel.socket. Feb 13 09:51:59.560835 systemd[1]: Reached target sockets.target. Feb 13 09:51:59.560841 systemd[1]: Starting kmod-static-nodes.service... Feb 13 09:51:59.560846 systemd[1]: Finished network-cleanup.service. Feb 13 09:51:59.560851 systemd[1]: Starting systemd-fsck-usr.service... Feb 13 09:51:59.560857 systemd[1]: Starting systemd-journald.service... Feb 13 09:51:59.560862 systemd[1]: Starting systemd-modules-load.service... Feb 13 09:51:59.560870 systemd-journald[268]: Journal started Feb 13 09:51:59.560895 systemd-journald[268]: Runtime Journal (/run/log/journal/b425ac389a174564a62754540eeffaff) is 8.0M, max 640.1M, 632.1M free. Feb 13 09:51:59.564187 systemd-modules-load[269]: Inserted module 'overlay' Feb 13 09:51:59.622543 kernel: audit: type=1334 audit(1707817919.569:2): prog-id=6 op=LOAD Feb 13 09:51:59.622554 systemd[1]: Starting systemd-resolved.service... Feb 13 09:51:59.622563 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Feb 13 09:51:59.569000 audit: BPF prog-id=6 op=LOAD Feb 13 09:51:59.654472 kernel: Bridge firewalling registered Feb 13 09:51:59.654501 systemd[1]: Starting systemd-vconsole-setup.service... Feb 13 09:51:59.670219 systemd-modules-load[269]: Inserted module 'br_netfilter' Feb 13 09:51:59.706555 systemd[1]: Started systemd-journald.service. Feb 13 09:51:59.706567 kernel: SCSI subsystem initialized Feb 13 09:51:59.676555 systemd-resolved[271]: Positive Trust Anchors: Feb 13 09:51:59.822977 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Feb 13 09:51:59.822992 kernel: audit: type=1130 audit(1707817919.726:3): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 09:51:59.823001 kernel: device-mapper: uevent: version 1.0.3 Feb 13 09:51:59.823008 kernel: device-mapper: ioctl: 4.45.0-ioctl (2021-03-22) initialised: dm-devel@redhat.com Feb 13 09:51:59.726000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:51:59.676561 systemd-resolved[271]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Feb 13 09:51:59.891628 kernel: audit: type=1130 audit(1707817919.835:4): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:51:59.835000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:51:59.676580 systemd-resolved[271]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa corp home internal intranet lan local private test Feb 13 09:51:59.964668 kernel: audit: type=1130 audit(1707817919.898:5): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:51:59.898000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:51:59.678143 systemd-resolved[271]: Defaulting to hostname 'linux'. Feb 13 09:51:59.972000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-usr comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:51:59.728605 systemd[1]: Started systemd-resolved.service. Feb 13 09:52:00.069593 kernel: audit: type=1130 audit(1707817919.972:6): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-usr comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:00.069606 kernel: audit: type=1130 audit(1707817920.023:7): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:00.023000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:51:59.823405 systemd-modules-load[269]: Inserted module 'dm_multipath' Feb 13 09:52:00.068000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:51:59.836741 systemd[1]: Finished kmod-static-nodes.service. 
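The negative trust anchors listed by systemd-resolved above name private-use DNS subtrees (lan, local, home.arpa, the RFC 1918 reverse zones, and so on) that are exempted from DNSSEC validation. Matching is by whole-label suffix; the sketch below illustrates that rule only and is not resolved's actual implementation:

    # Sketch: whole-label suffix matching against negative trust anchors.
    # Abridged anchor set copied from the log above.
    NEGATIVE_ANCHORS = {"home.arpa", "10.in-addr.arpa", "168.192.in-addr.arpa",
                        "corp", "home", "internal", "intranet", "lan",
                        "local", "private", "test"}

    def under_negative_anchor(name: str) -> bool:
        labels = name.rstrip(".").split(".")
        return any(".".join(labels[i:]) in NEGATIVE_ANCHORS
                   for i in range(len(labels)))

    print(under_negative_anchor("printer.lan"))          # True
    print(under_negative_anchor("5.0.10.in-addr.arpa"))  # True
    print(under_negative_anchor("example.com"))          # False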
Feb 13 09:52:00.132669 kernel: audit: type=1130 audit(1707817920.068:8): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:51:59.919907 systemd[1]: Finished systemd-fsck-usr.service. Feb 13 09:51:59.993812 systemd[1]: Finished systemd-modules-load.service. Feb 13 09:52:00.024753 systemd[1]: Finished systemd-vconsole-setup.service. Feb 13 09:52:00.069730 systemd[1]: Reached target nss-lookup.target. Feb 13 09:52:00.126125 systemd[1]: Starting dracut-cmdline-ask.service... Feb 13 09:52:00.133078 systemd[1]: Starting systemd-sysctl.service... Feb 13 09:52:00.147102 systemd[1]: Starting systemd-tmpfiles-setup-dev.service... Feb 13 09:52:00.147812 systemd[1]: Finished systemd-sysctl.service. Feb 13 09:52:00.146000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:00.149745 systemd[1]: Finished systemd-tmpfiles-setup-dev.service. Feb 13 09:52:00.195656 kernel: audit: type=1130 audit(1707817920.146:9): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:00.209000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:00.210778 systemd[1]: Finished dracut-cmdline-ask.service. Feb 13 09:52:00.276555 kernel: audit: type=1130 audit(1707817920.209:10): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:00.267000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:00.269087 systemd[1]: Starting dracut-cmdline.service... Feb 13 09:52:00.290582 dracut-cmdline[292]: dracut-dracut-053 Feb 13 09:52:00.290582 dracut-cmdline[292]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LA Feb 13 09:52:00.290582 dracut-cmdline[292]: BEL=ROOT console=tty0 console=ttyS1,115200n8 flatcar.first_boot=detected flatcar.oem.id=packet flatcar.autologin verity.usrhash=f2beb0668e3dab90bbcf0ace3803b7ee02142bfb86913ef12ef6d2ee81a411a4 Feb 13 09:52:00.357587 kernel: Loading iSCSI transport class v2.0-870. Feb 13 09:52:00.357598 kernel: iscsi: registered transport (tcp) Feb 13 09:52:00.408266 kernel: iscsi: registered transport (qla4xxx) Feb 13 09:52:00.408283 kernel: QLogic iSCSI HBA Driver Feb 13 09:52:00.424351 systemd[1]: Finished dracut-cmdline.service. Feb 13 09:52:00.431000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:00.433166 systemd[1]: Starting dracut-pre-udev.service... 
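dracut-cmdline above echoes the kernel command line it will honor; the journal wraps the long line mid-token, which is why root=LABEL=ROOT appears split as "root=LA" / "BEL=ROOT". A minimal sketch of parsing such a command line into options, keeping repeated keys like console= in order (abridged parameters copied from this log):

    # Sketch: parse a kernel command line into a key -> [values] map.
    cmdline = ("BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr "
               "rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT "
               "console=tty0 console=ttyS1,115200n8 flatcar.first_boot=detected")
    opts = {}
    for token in cmdline.split():
        key, _, value = token.partition("=")   # bare flags get value ""
        opts.setdefault(key, []).append(value)
    # The kernel logs to every console= given; the last becomes /dev/console.
    print(opts["console"])   # ['tty0', 'ttyS1,115200n8']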
Feb 13 09:52:00.488488 kernel: raid6: avx2x4 gen() 48106 MB/s Feb 13 09:52:00.523533 kernel: raid6: avx2x4 xor() 14961 MB/s Feb 13 09:52:00.558538 kernel: raid6: avx2x2 gen() 51896 MB/s Feb 13 09:52:00.593487 kernel: raid6: avx2x2 xor() 32025 MB/s Feb 13 09:52:00.628488 kernel: raid6: avx2x1 gen() 44551 MB/s Feb 13 09:52:00.662521 kernel: raid6: avx2x1 xor() 27828 MB/s Feb 13 09:52:00.696539 kernel: raid6: sse2x4 gen() 21369 MB/s Feb 13 09:52:00.730491 kernel: raid6: sse2x4 xor() 11963 MB/s Feb 13 09:52:00.764487 kernel: raid6: sse2x2 gen() 21571 MB/s Feb 13 09:52:00.798538 kernel: raid6: sse2x2 xor() 13401 MB/s Feb 13 09:52:00.832533 kernel: raid6: sse2x1 gen() 18244 MB/s Feb 13 09:52:00.884047 kernel: raid6: sse2x1 xor() 8928 MB/s Feb 13 09:52:00.884062 kernel: raid6: using algorithm avx2x2 gen() 51896 MB/s Feb 13 09:52:00.884069 kernel: raid6: .... xor() 32025 MB/s, rmw enabled Feb 13 09:52:00.902081 kernel: raid6: using avx2x2 recovery algorithm Feb 13 09:52:00.948489 kernel: xor: automatically using best checksumming function avx Feb 13 09:52:01.027492 kernel: Btrfs loaded, crc32c=crc32c-intel, zoned=no, fsverity=no Feb 13 09:52:01.032007 systemd[1]: Finished dracut-pre-udev.service. Feb 13 09:52:01.039000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:01.039000 audit: BPF prog-id=7 op=LOAD Feb 13 09:52:01.039000 audit: BPF prog-id=8 op=LOAD Feb 13 09:52:01.041383 systemd[1]: Starting systemd-udevd.service... Feb 13 09:52:01.049345 systemd-udevd[474]: Using default interface naming scheme 'v252'. Feb 13 09:52:01.070000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:01.056742 systemd[1]: Started systemd-udevd.service. Feb 13 09:52:01.095570 dracut-pre-trigger[487]: rd.md=0: removing MD RAID activation Feb 13 09:52:01.072127 systemd[1]: Starting dracut-pre-trigger.service... Feb 13 09:52:01.110000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:01.100510 systemd[1]: Finished dracut-pre-trigger.service. Feb 13 09:52:01.112633 systemd[1]: Starting systemd-udev-trigger.service... Feb 13 09:52:01.162458 systemd[1]: Finished systemd-udev-trigger.service. Feb 13 09:52:01.161000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:01.189462 kernel: cryptd: max_cpu_qlen set to 1000 Feb 13 09:52:01.191465 kernel: libata version 3.00 loaded. Feb 13 09:52:01.226543 kernel: ACPI: bus type USB registered Feb 13 09:52:01.226579 kernel: usbcore: registered new interface driver usbfs Feb 13 09:52:01.226591 kernel: usbcore: registered new interface driver hub Feb 13 09:52:01.244281 kernel: usbcore: registered new device driver usb Feb 13 09:52:01.285465 kernel: AVX2 version of gcm_enc/dec engaged. 
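The raid6 block above is the kernel benchmarking its parity implementations at boot: every available gen()/xor() routine is timed and the fastest generator is kept, here avx2x2 at 51896 MB/s with its 32025 MB/s xor. A toy sketch of that selection rule, with the MB/s figures copied from this log:

    # Sketch: pick the raid6 algorithm the way the boot benchmark does --
    # highest gen() throughput wins. Figures (MB/s) copied from the log.
    gen_speeds = {"avx2x4": 48106, "avx2x2": 51896, "avx2x1": 44551,
                  "sse2x4": 21369, "sse2x2": 21571, "sse2x1": 18244}
    best = max(gen_speeds, key=gen_speeds.get)
    print(f"raid6: using algorithm {best} gen() {gen_speeds[best]} MB/s")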
Feb 13 09:52:01.285511 kernel: ahci 0000:00:17.0: version 3.0 Feb 13 09:52:01.285623 kernel: AES CTR mode by8 optimization enabled Feb 13 09:52:01.285633 kernel: ahci 0000:00:17.0: AHCI 0001.0301 32 slots 7 ports 6 Gbps 0x7f impl SATA mode Feb 13 09:52:01.343089 kernel: ahci 0000:00:17.0: flags: 64bit ncq sntf clo only pio slum part ems deso sadm sds apst Feb 13 09:52:01.379298 kernel: igb: Intel(R) Gigabit Ethernet Network Driver Feb 13 09:52:01.379369 kernel: igb: Copyright (c) 2007-2014 Intel Corporation. Feb 13 09:52:01.379382 kernel: scsi host0: ahci Feb 13 09:52:01.408122 kernel: scsi host1: ahci Feb 13 09:52:01.419596 kernel: mlx5_core 0000:01:00.0: firmware version: 14.31.1014 Feb 13 09:52:01.433546 kernel: mlx5_core 0000:01:00.0: 63.008 Gb/s available PCIe bandwidth (8.0 GT/s PCIe x8 link) Feb 13 09:52:01.440458 kernel: pps pps0: new PPS source ptp0 Feb 13 09:52:01.440536 kernel: scsi host2: ahci Feb 13 09:52:01.440603 kernel: scsi host3: ahci Feb 13 09:52:01.440670 kernel: scsi host4: ahci Feb 13 09:52:01.440732 kernel: scsi host5: ahci Feb 13 09:52:01.440792 kernel: scsi host6: ahci Feb 13 09:52:01.440851 kernel: ata1: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516100 irq 127 Feb 13 09:52:01.440860 kernel: ata2: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516180 irq 127 Feb 13 09:52:01.440868 kernel: ata3: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516200 irq 127 Feb 13 09:52:01.440875 kernel: ata4: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516280 irq 127 Feb 13 09:52:01.440885 kernel: ata5: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516300 irq 127 Feb 13 09:52:01.440893 kernel: ata6: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516380 irq 127 Feb 13 09:52:01.440901 kernel: ata7: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516400 irq 127 Feb 13 09:52:01.627833 kernel: igb 0000:03:00.0: added PHC on eth0 Feb 13 09:52:01.628055 kernel: igb 0000:03:00.0: Intel(R) Gigabit Ethernet Network Connection Feb 13 09:52:01.653740 kernel: igb 0000:03:00.0: eth0: (PCIe:2.5Gb/s:Width x1) 00:25:90:bd:75:2a Feb 13 09:52:01.653938 kernel: igb 0000:03:00.0: eth0: PBA No: 010000-000 Feb 13 09:52:01.678767 kernel: igb 0000:03:00.0: Using MSI-X interrupts. 4 rx queue(s), 4 tx queue(s) Feb 13 09:52:01.729134 kernel: pps pps1: new PPS source ptp1 Feb 13 09:52:01.729212 kernel: igb 0000:04:00.0: added PHC on eth1 Feb 13 09:52:01.729272 kernel: mlx5_core 0000:01:00.0: E-Switch: Total vports 10, per vport: max uc(128) max mc(2048) Feb 13 09:52:01.729325 kernel: igb 0000:04:00.0: Intel(R) Gigabit Ethernet Network Connection Feb 13 09:52:01.762456 kernel: ata7: SATA link down (SStatus 0 SControl 300) Feb 13 09:52:01.762474 kernel: igb 0000:04:00.0: eth1: (PCIe:2.5Gb/s:Width x1) 00:25:90:bd:75:2b Feb 13 09:52:01.762552 kernel: mlx5_core 0000:01:00.0: MLX5E: StrdRq(0) RqSz(1024) StrdSz(256) RxCqeCmprss(0) Feb 13 09:52:01.771454 kernel: ata3: SATA link down (SStatus 0 SControl 300) Feb 13 09:52:01.801735 kernel: igb 0000:04:00.0: eth1: PBA No: 010000-000 Feb 13 09:52:01.801835 kernel: ata4: SATA link down (SStatus 0 SControl 300) Feb 13 09:52:01.828028 kernel: igb 0000:04:00.0: Using MSI-X interrupts. 
4 rx queue(s), 4 tx queue(s) Feb 13 09:52:01.828098 kernel: ata6: SATA link down (SStatus 0 SControl 300) Feb 13 09:52:01.871525 kernel: ata2: SATA link up 6.0 Gbps (SStatus 133 SControl 300) Feb 13 09:52:01.885494 kernel: ata5: SATA link down (SStatus 0 SControl 300) Feb 13 09:52:01.899513 kernel: ata2.00: ATA-10: Micron_5200_MTFDDAK480TDN, D1MU020, max UDMA/133 Feb 13 09:52:01.914528 kernel: ata1: SATA link up 6.0 Gbps (SStatus 133 SControl 300) Feb 13 09:52:01.929509 kernel: ata1.00: ATA-10: Micron_5200_MTFDDAK480TDN, D1MU004, max UDMA/133 Feb 13 09:52:01.975333 kernel: ata2.00: 937703088 sectors, multi 16: LBA48 NCQ (depth 32), AA Feb 13 09:52:01.975364 kernel: ata2.00: Features: NCQ-prio Feb 13 09:52:02.010251 kernel: ata1.00: 937703088 sectors, multi 16: LBA48 NCQ (depth 32), AA Feb 13 09:52:02.010266 kernel: mlx5_core 0000:01:00.0: Supported tc offload range - chains: 4294967294, prios: 4294967295 Feb 13 09:52:02.010355 kernel: ata1.00: Features: NCQ-prio Feb 13 09:52:02.023519 kernel: mlx5_core 0000:01:00.1: firmware version: 14.31.1014 Feb 13 09:52:02.051034 kernel: ata2.00: configured for UDMA/133 Feb 13 09:52:02.051049 kernel: mlx5_core 0000:01:00.1: 63.008 Gb/s available PCIe bandwidth (8.0 GT/s PCIe x8 link) Feb 13 09:52:02.069487 kernel: ata1.00: configured for UDMA/133 Feb 13 09:52:02.082503 kernel: scsi 0:0:0:0: Direct-Access ATA Micron_5200_MTFD U004 PQ: 0 ANSI: 5 Feb 13 09:52:02.117460 kernel: scsi 1:0:0:0: Direct-Access ATA Micron_5200_MTFD U020 PQ: 0 ANSI: 5 Feb 13 09:52:02.133467 kernel: igb 0000:03:00.0 eno1: renamed from eth0 Feb 13 09:52:02.133653 kernel: xhci_hcd 0000:00:14.0: xHCI Host Controller Feb 13 09:52:02.164318 kernel: xhci_hcd 0000:00:14.0: new USB bus registered, assigned bus number 1 Feb 13 09:52:02.184458 kernel: xhci_hcd 0000:00:14.0: hcc params 0x200077c1 hci version 0x110 quirks 0x0000000000009810 Feb 13 09:52:02.184634 kernel: xhci_hcd 0000:00:14.0: xHCI Host Controller Feb 13 09:52:02.216101 kernel: xhci_hcd 0000:00:14.0: new USB bus registered, assigned bus number 2 Feb 13 09:52:02.216263 kernel: xhci_hcd 0000:00:14.0: Host supports USB 3.1 Enhanced SuperSpeed Feb 13 09:52:02.246457 kernel: hub 1-0:1.0: USB hub found Feb 13 09:52:02.246681 kernel: hub 1-0:1.0: 16 ports detected Feb 13 09:52:02.260458 kernel: ata2.00: Enabling discard_zeroes_data Feb 13 09:52:02.275059 kernel: hub 2-0:1.0: USB hub found Feb 13 09:52:02.275140 kernel: ata1.00: Enabling discard_zeroes_data Feb 13 09:52:02.275149 kernel: sd 1:0:0:0: [sda] 937703088 512-byte logical blocks: (480 GB/447 GiB) Feb 13 09:52:02.275210 kernel: sd 0:0:0:0: [sdb] 937703088 512-byte logical blocks: (480 GB/447 GiB) Feb 13 09:52:02.275281 kernel: sd 0:0:0:0: [sdb] 4096-byte physical blocks Feb 13 09:52:02.275335 kernel: sd 0:0:0:0: [sdb] Write Protect is off Feb 13 09:52:02.275388 kernel: sd 0:0:0:0: [sdb] Mode Sense: 00 3a 00 00 Feb 13 09:52:02.275458 kernel: sd 0:0:0:0: [sdb] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA Feb 13 09:52:02.275516 kernel: ata1.00: Enabling discard_zeroes_data Feb 13 09:52:02.277458 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Feb 13 09:52:02.277473 kernel: GPT:9289727 != 937703087 Feb 13 09:52:02.277480 kernel: GPT:Alternate GPT header not at the end of the disk. Feb 13 09:52:02.277487 kernel: GPT:9289727 != 937703087 Feb 13 09:52:02.277493 kernel: GPT: Use GNU Parted to correct GPT errors. 
Feb 13 09:52:02.277499 kernel: sdb: sdb1 sdb2 sdb3 sdb4 sdb6 sdb7 sdb9 Feb 13 09:52:02.277505 kernel: ata1.00: Enabling discard_zeroes_data Feb 13 09:52:02.277511 kernel: sd 0:0:0:0: [sdb] Attached SCSI disk Feb 13 09:52:02.303564 kernel: hub 2-0:1.0: 10 ports detected Feb 13 09:52:02.303640 kernel: sd 1:0:0:0: [sda] 4096-byte physical blocks Feb 13 09:52:02.339415 kernel: usb: port power management may be unreliable Feb 13 09:52:02.339429 kernel: sd 1:0:0:0: [sda] Write Protect is off Feb 13 09:52:02.404328 kernel: mlx5_core 0000:01:00.1: E-Switch: Total vports 10, per vport: max uc(128) max mc(2048) Feb 13 09:52:02.404403 kernel: sd 1:0:0:0: [sda] Mode Sense: 00 3a 00 00 Feb 13 09:52:02.404465 kernel: sd 1:0:0:0: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA Feb 13 09:52:02.409506 kernel: igb 0000:04:00.0 eno2: renamed from eth1 Feb 13 09:52:02.409579 kernel: port_module: 9 callbacks suppressed Feb 13 09:52:02.409589 kernel: mlx5_core 0000:01:00.1: Port module event: module 1, Cable plugged Feb 13 09:52:02.436515 kernel: ata2.00: Enabling discard_zeroes_data Feb 13 09:52:02.510937 kernel: usb 1-14: new high-speed USB device number 2 using xhci_hcd Feb 13 09:52:02.525457 kernel: ata2.00: Enabling discard_zeroes_data Feb 13 09:52:02.666783 kernel: hub 1-14:1.0: USB hub found Feb 13 09:52:02.666865 kernel: sd 1:0:0:0: [sda] Attached SCSI disk Feb 13 09:52:02.683494 kernel: hub 1-14:1.0: 4 ports detected Feb 13 09:52:02.683572 kernel: mlx5_core 0000:01:00.1: MLX5E: StrdRq(0) RqSz(1024) StrdSz(256) RxCqeCmprss(0) Feb 13 09:52:02.742342 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device. Feb 13 09:52:02.809544 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/sdb6 scanned by (udev-worker) (528) Feb 13 09:52:02.802968 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device. Feb 13 09:52:02.819666 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device. Feb 13 09:52:02.846794 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device. Feb 13 09:52:02.870203 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device. Feb 13 09:52:02.879990 systemd[1]: Starting disk-uuid.service... Feb 13 09:52:02.922653 kernel: ata1.00: Enabling discard_zeroes_data Feb 13 09:52:02.922699 kernel: sdb: sdb1 sdb2 sdb3 sdb4 sdb6 sdb7 sdb9 Feb 13 09:52:02.922726 kernel: mlx5_core 0000:01:00.1: Supported tc offload range - chains: 4294967294, prios: 4294967295 Feb 13 09:52:02.923145 disk-uuid[678]: Primary Header is updated. Feb 13 09:52:02.923145 disk-uuid[678]: Secondary Entries is updated. Feb 13 09:52:02.923145 disk-uuid[678]: Secondary Header is updated. 
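The GPT warnings above ("Primary header thinks Alt. header is not at the end of the disk", 9289727 != 937703087) mean the primary header at LBA 1 still records the backup-header location baked into the original, smaller disk image rather than the last sector of this 480 GB disk; disk-uuid then rewrites the secondary entries and headers, after which the kernel re-reads a clean table. A sketch of the underlying check, assuming 512-byte logical sectors as reported for sdb; the device path is illustrative and reading it needs root:

    # Sketch: compare the primary GPT header's alternate-LBA field with the
    # disk's actual last sector (the check behind the warning above).
    import os, struct

    def gpt_backup_mismatch(path: str) -> bool:
        with open(path, "rb") as f:
            size = f.seek(0, os.SEEK_END)    # device size in bytes
            f.seek(512)                      # primary GPT header is LBA 1
            hdr = f.read(92)
        assert hdr[:8] == b"EFI PART", "no GPT signature"
        alternate_lba = struct.unpack_from("<Q", hdr, 32)[0]
        return alternate_lba != size // 512 - 1

    # e.g. gpt_backup_mismatch("/dev/sdb")  # True before disk-uuid repairs it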
Feb 13 09:52:03.010535 kernel: ata1.00: Enabling discard_zeroes_data Feb 13 09:52:03.010545 kernel: sdb: sdb1 sdb2 sdb3 sdb4 sdb6 sdb7 sdb9 Feb 13 09:52:03.010551 kernel: ata1.00: Enabling discard_zeroes_data Feb 13 09:52:03.010558 kernel: sdb: sdb1 sdb2 sdb3 sdb4 sdb6 sdb7 sdb9 Feb 13 09:52:03.035517 kernel: mlx5_core 0000:01:00.0 enp1s0f0np0: renamed from eth2 Feb 13 09:52:03.060462 kernel: mlx5_core 0000:01:00.1 enp1s0f1np1: renamed from eth0 Feb 13 09:52:03.060569 kernel: usb 1-14.1: new low-speed USB device number 3 using xhci_hcd Feb 13 09:52:03.217508 kernel: hid: raw HID events driver (C) Jiri Kosina Feb 13 09:52:03.250290 kernel: usbcore: registered new interface driver usbhid Feb 13 09:52:03.250309 kernel: usbhid: USB HID core driver Feb 13 09:52:03.284528 kernel: input: HID 0557:2419 as /devices/pci0000:00/0000:00:14.0/usb1/1-14/1-14.1/1-14.1:1.0/0003:0557:2419.0001/input/input0 Feb 13 09:52:03.403258 kernel: hid-generic 0003:0557:2419.0001: input,hidraw0: USB HID v1.00 Keyboard [HID 0557:2419] on usb-0000:00:14.0-14.1/input0 Feb 13 09:52:03.403383 kernel: input: HID 0557:2419 as /devices/pci0000:00/0000:00:14.0/usb1/1-14/1-14.1/1-14.1:1.1/0003:0557:2419.0002/input/input1 Feb 13 09:52:03.403391 kernel: hid-generic 0003:0557:2419.0002: input,hidraw1: USB HID v1.00 Mouse [HID 0557:2419] on usb-0000:00:14.0-14.1/input1 Feb 13 09:52:04.000201 kernel: ata1.00: Enabling discard_zeroes_data Feb 13 09:52:04.020366 disk-uuid[679]: The operation has completed successfully. Feb 13 09:52:04.029494 kernel: sdb: sdb1 sdb2 sdb3 sdb4 sdb6 sdb7 sdb9 Feb 13 09:52:04.057906 systemd[1]: disk-uuid.service: Deactivated successfully. Feb 13 09:52:04.154518 kernel: audit: type=1130 audit(1707817924.063:19): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:04.154533 kernel: audit: type=1131 audit(1707817924.063:20): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:04.063000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:04.063000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:04.057950 systemd[1]: Finished disk-uuid.service. Feb 13 09:52:04.184546 kernel: device-mapper: verity: sha256 using implementation "sha256-avx2" Feb 13 09:52:04.072976 systemd[1]: Starting verity-setup.service... Feb 13 09:52:04.216763 systemd[1]: Found device dev-mapper-usr.device. Feb 13 09:52:04.225449 systemd[1]: Mounting sysusr-usr.mount... Feb 13 09:52:04.231754 systemd[1]: Finished verity-setup.service. Feb 13 09:52:04.241000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=verity-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:04.291478 kernel: audit: type=1130 audit(1707817924.241:21): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=verity-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:04.348212 systemd[1]: Mounted sysusr-usr.mount. 
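verity-setup above opens /usr as the dm-verity target whose root hash arrived on verity.usrhash=, using the sha256-avx2 implementation. Real dm-verity maintains a salted, multi-level Merkle tree over fixed-size blocks and verifies them on demand; the unsalted one-level toy below only illustrates why flipping a single byte anywhere changes the root digest:

    # Toy illustration of the dm-verity idea (NOT the real on-disk format):
    # hash each 4 KiB block, then hash the concatenated block hashes.
    import hashlib

    def toy_root_hash(data: bytes, block: int = 4096) -> str:
        leaves = b"".join(hashlib.sha256(data[i:i + block]).digest()
                          for i in range(0, len(data), block))
        return hashlib.sha256(leaves).hexdigest()

    image = b"A" * 8192
    print(toy_root_hash(image))
    print(toy_root_hash(image[:-1] + b"B"))   # one changed byte, new root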
Feb 13 09:52:04.362553 kernel: EXT4-fs (dm-0): mounted filesystem without journal. Opts: norecovery. Quota mode: none. Feb 13 09:52:04.355730 systemd[1]: afterburn-network-kargs.service was skipped because no trigger condition checks were met. Feb 13 09:52:04.447775 kernel: BTRFS info (device sdb6): using crc32c (crc32c-intel) checksum algorithm Feb 13 09:52:04.447788 kernel: BTRFS info (device sdb6): using free space tree Feb 13 09:52:04.447796 kernel: BTRFS info (device sdb6): has skinny extents Feb 13 09:52:04.447802 kernel: BTRFS info (device sdb6): enabling ssd optimizations Feb 13 09:52:04.356113 systemd[1]: Starting ignition-setup.service... Feb 13 09:52:04.377957 systemd[1]: Starting parse-ip-for-networkd.service... Feb 13 09:52:04.520551 kernel: audit: type=1130 audit(1707817924.470:22): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:04.470000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:04.455893 systemd[1]: Finished parse-ip-for-networkd.service. Feb 13 09:52:04.527000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:04.471796 systemd[1]: Finished ignition-setup.service. Feb 13 09:52:04.608507 kernel: audit: type=1130 audit(1707817924.527:23): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:04.608521 kernel: audit: type=1334 audit(1707817924.584:24): prog-id=9 op=LOAD Feb 13 09:52:04.584000 audit: BPF prog-id=9 op=LOAD Feb 13 09:52:04.529139 systemd[1]: Starting ignition-fetch-offline.service... Feb 13 09:52:04.586363 systemd[1]: Starting systemd-networkd.service... Feb 13 09:52:04.632000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:04.670908 ignition[869]: Ignition 2.14.0 Feb 13 09:52:04.698635 kernel: audit: type=1130 audit(1707817924.632:25): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:04.621890 systemd-networkd[881]: lo: Link UP Feb 13 09:52:04.670912 ignition[869]: Stage: fetch-offline Feb 13 09:52:04.621892 systemd-networkd[881]: lo: Gained carrier Feb 13 09:52:04.775579 kernel: audit: type=1130 audit(1707817924.723:26): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=iscsiuio comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:04.723000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=iscsiuio comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 09:52:04.670940 ignition[869]: reading system config file "/usr/lib/ignition/base.d/base.ign" Feb 13 09:52:04.858192 kernel: audit: type=1130 audit(1707817924.782:27): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:04.858278 kernel: mlx5_core 0000:01:00.1 enp1s0f1np1: Link up Feb 13 09:52:04.782000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:04.622174 systemd-networkd[881]: Enumeration completed Feb 13 09:52:04.883776 kernel: IPv6: ADDRCONF(NETDEV_CHANGE): enp1s0f1np1: link becomes ready Feb 13 09:52:04.670955 ignition[869]: parsing config with SHA512: 0131bd505bfe1b1215ca4ec9809701a3323bf448114294874f7249d8d300440bd742a7532f60673bfa0746c04de0bd5ca68d0fe9a8ecd59464b13a6401323cb4 Feb 13 09:52:04.622250 systemd[1]: Started systemd-networkd.service. Feb 13 09:52:04.678664 ignition[869]: no config dir at "/usr/lib/ignition/base.platform.d/packet" Feb 13 09:52:04.908000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=iscsid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:04.622850 systemd-networkd[881]: enp1s0f1np1: Configuring with /usr/lib/systemd/network/zz-default.network. Feb 13 09:52:04.936537 iscsid[906]: iscsid: can't open InitiatorName configuration file /etc/iscsi/initiatorname.iscsi Feb 13 09:52:04.936537 iscsid[906]: iscsid: Warning: InitiatorName file /etc/iscsi/initiatorname.iscsi does not exist or does not contain a properly formatted InitiatorName. If using software iscsi (iscsi_tcp or ib_iser) or partial offload (bnx2i or cxgbi iscsi), you may not be able to log Feb 13 09:52:04.936537 iscsid[906]: into or discover targets. Please create a file /etc/iscsi/initiatorname.iscsi that contains a string with the format: InitiatorName=iqn.yyyy-mm.<reversed domain name>[:identifier]. Feb 13 09:52:04.936537 iscsid[906]: Example: InitiatorName=iqn.2001-04.com.redhat:fc6. Feb 13 09:52:04.936537 iscsid[906]: If using hardware iscsi like qla4xxx this message can be ignored. Feb 13 09:52:04.936537 iscsid[906]: iscsid: can't open InitiatorAlias configuration file /etc/iscsi/initiatorname.iscsi Feb 13 09:52:04.936537 iscsid[906]: iscsid: can't open iscsid.safe_logout configuration file /etc/iscsi/iscsid.conf Feb 13 09:52:04.943000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:04.678733 ignition[869]: parsed url from cmdline: "" Feb 13 09:52:04.633586 systemd[1]: Reached target network.target. Feb 13 09:52:04.678735 ignition[869]: no config URL provided Feb 13 09:52:05.111618 kernel: mlx5_core 0000:01:00.0 enp1s0f0np0: Link up Feb 13 09:52:05.103000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:04.693960 systemd[1]: Starting iscsiuio.service...
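The iscsid warning above is benign here (no software iSCSI is in use) and states its own fix: put a single InitiatorName= line in /etc/iscsi/initiatorname.iscsi. A sketch of that, with a made-up IQN in the documented iqn.yyyy-mm.<reversed domain name>[:identifier] form; it must run as root:

    # Sketch: create the InitiatorName file iscsid asks for above.
    from pathlib import Path

    iqn = "iqn.2024-02.net.example:node1"     # hypothetical identifier
    Path("/etc/iscsi").mkdir(parents=True, exist_ok=True)
    Path("/etc/iscsi/initiatorname.iscsi").write_text(f"InitiatorName={iqn}\n")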
Feb 13 09:52:04.678738 ignition[869]: reading system config file "/usr/lib/ignition/user.ign" Feb 13 09:52:04.698046 unknown[869]: fetched base config from "system" Feb 13 09:52:04.678768 ignition[869]: parsing config with SHA512: eb3c97db061c2d71000aa41259b544620b0b17400d3c303e000969cc655e81797ded4118264da43c83dd8a5632410432c27cad131cd13760d3d13b58eac93ab5 Feb 13 09:52:04.698050 unknown[869]: fetched user config from "system" Feb 13 09:52:04.698398 ignition[869]: fetch-offline: fetch-offline passed Feb 13 09:52:04.705662 systemd[1]: Started iscsiuio.service. Feb 13 09:52:04.698401 ignition[869]: POST message to Packet Timeline Feb 13 09:52:04.724818 systemd[1]: Finished ignition-fetch-offline.service. Feb 13 09:52:04.698405 ignition[869]: POST Status error: resource requires networking Feb 13 09:52:04.783704 systemd[1]: ignition-fetch.service was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). Feb 13 09:52:04.698435 ignition[869]: Ignition finished successfully Feb 13 09:52:04.784158 systemd[1]: Starting ignition-kargs.service... Feb 13 09:52:04.862613 ignition[895]: Ignition 2.14.0 Feb 13 09:52:04.859616 systemd-networkd[881]: enp1s0f0np0: Configuring with /usr/lib/systemd/network/zz-default.network. Feb 13 09:52:04.862617 ignition[895]: Stage: kargs Feb 13 09:52:04.872034 systemd[1]: Starting iscsid.service... Feb 13 09:52:04.862670 ignition[895]: reading system config file "/usr/lib/ignition/base.d/base.ign" Feb 13 09:52:04.894635 systemd[1]: Started iscsid.service. Feb 13 09:52:04.862680 ignition[895]: parsing config with SHA512: 0131bd505bfe1b1215ca4ec9809701a3323bf448114294874f7249d8d300440bd742a7532f60673bfa0746c04de0bd5ca68d0fe9a8ecd59464b13a6401323cb4 Feb 13 09:52:04.909959 systemd[1]: Starting dracut-initqueue.service... Feb 13 09:52:04.863996 ignition[895]: no config dir at "/usr/lib/ignition/base.platform.d/packet" Feb 13 09:52:04.919683 systemd[1]: Finished dracut-initqueue.service. Feb 13 09:52:04.865594 ignition[895]: kargs: kargs passed Feb 13 09:52:04.944573 systemd[1]: Reached target remote-fs-pre.target. Feb 13 09:52:04.865597 ignition[895]: POST message to Packet Timeline Feb 13 09:52:04.990639 systemd[1]: Reached target remote-cryptsetup.target. Feb 13 09:52:04.865607 ignition[895]: GET https://metadata.packet.net/metadata: attempt #1 Feb 13 09:52:05.017656 systemd[1]: Reached target remote-fs.target. Feb 13 09:52:04.868227 ignition[895]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:32929->[::1]:53: read: connection refused Feb 13 09:52:05.035521 systemd[1]: Starting dracut-pre-mount.service... Feb 13 09:52:05.068616 ignition[895]: GET https://metadata.packet.net/metadata: attempt #2 Feb 13 09:52:05.061669 systemd[1]: Finished dracut-pre-mount.service. Feb 13 09:52:05.068989 ignition[895]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:55826->[::1]:53: read: connection refused Feb 13 09:52:05.093934 systemd-networkd[881]: eno2: Configuring with /usr/lib/systemd/network/zz-default.network. Feb 13 09:52:05.122905 systemd-networkd[881]: eno1: Configuring with /usr/lib/systemd/network/zz-default.network. 
Feb 13 09:52:05.151881 systemd-networkd[881]: enp1s0f1np1: Link UP Feb 13 09:52:05.152094 systemd-networkd[881]: enp1s0f1np1: Gained carrier Feb 13 09:52:05.164856 systemd-networkd[881]: enp1s0f0np0: Link UP Feb 13 09:52:05.165130 systemd-networkd[881]: eno2: Link UP Feb 13 09:52:05.165384 systemd-networkd[881]: eno1: Link UP Feb 13 09:52:05.469143 ignition[895]: GET https://metadata.packet.net/metadata: attempt #3 Feb 13 09:52:05.470163 ignition[895]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:48590->[::1]:53: read: connection refused Feb 13 09:52:05.928182 systemd-networkd[881]: enp1s0f0np0: Gained carrier Feb 13 09:52:05.936551 kernel: IPv6: ADDRCONF(NETDEV_CHANGE): enp1s0f0np0: link becomes ready Feb 13 09:52:05.957645 systemd-networkd[881]: enp1s0f0np0: DHCPv4 address 139.178.70.43/31, gateway 139.178.70.42 acquired from 145.40.83.140 Feb 13 09:52:06.270784 ignition[895]: GET https://metadata.packet.net/metadata: attempt #4 Feb 13 09:52:06.272021 ignition[895]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:53308->[::1]:53: read: connection refused Feb 13 09:52:06.672952 systemd-networkd[881]: enp1s0f1np1: Gained IPv6LL Feb 13 09:52:07.873845 ignition[895]: GET https://metadata.packet.net/metadata: attempt #5 Feb 13 09:52:07.875345 ignition[895]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:42004->[::1]:53: read: connection refused Feb 13 09:52:07.952845 systemd-networkd[881]: enp1s0f0np0: Gained IPv6LL Feb 13 09:52:11.078535 ignition[895]: GET https://metadata.packet.net/metadata: attempt #6 Feb 13 09:52:11.130042 ignition[895]: GET result: OK Feb 13 09:52:11.348884 ignition[895]: Ignition finished successfully Feb 13 09:52:11.353495 systemd[1]: Finished ignition-kargs.service. Feb 13 09:52:11.438748 kernel: kauditd_printk_skb: 3 callbacks suppressed Feb 13 09:52:11.438765 kernel: audit: type=1130 audit(1707817931.362:31): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:11.362000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:11.373042 ignition[925]: Ignition 2.14.0 Feb 13 09:52:11.365683 systemd[1]: Starting ignition-disks.service... Feb 13 09:52:11.373045 ignition[925]: Stage: disks Feb 13 09:52:11.373101 ignition[925]: reading system config file "/usr/lib/ignition/base.d/base.ign" Feb 13 09:52:11.373110 ignition[925]: parsing config with SHA512: 0131bd505bfe1b1215ca4ec9809701a3323bf448114294874f7249d8d300440bd742a7532f60673bfa0746c04de0bd5ca68d0fe9a8ecd59464b13a6401323cb4 Feb 13 09:52:11.375442 ignition[925]: no config dir at "/usr/lib/ignition/base.platform.d/packet" Feb 13 09:52:11.376179 ignition[925]: disks: disks passed Feb 13 09:52:11.376182 ignition[925]: POST message to Packet Timeline Feb 13 09:52:11.376192 ignition[925]: GET https://metadata.packet.net/metadata: attempt #1 Feb 13 09:52:11.399358 ignition[925]: GET result: OK Feb 13 09:52:11.616293 ignition[925]: Ignition finished successfully Feb 13 09:52:11.619354 systemd[1]: Finished ignition-disks.service. 
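The kargs stage above shows why Ignition's POSTs to the Packet timeline keep failing at first: lookups of metadata.packet.net go to [::1]:53 while no resolver is running, and only attempt #6 succeeds once enp1s0f0np0 has carrier and a DHCP address. The log does not reveal Ignition's exact retry policy; the sketch below simply retries a metadata GET with capped exponential backoff:

    # Sketch: retry an HTTP GET until the network is ready (illustrative
    # policy, not Ignition's actual implementation).
    import time, urllib.request

    def fetch_with_retry(url: str, attempts: int = 6) -> bytes:
        for n in range(1, attempts + 1):
            try:
                with urllib.request.urlopen(url, timeout=10) as resp:
                    return resp.read()
            except OSError as err:            # covers DNS and socket errors
                print(f"GET {url}: attempt #{n} failed: {err}")
                time.sleep(min(2 ** n, 30))
        raise RuntimeError("metadata service unreachable")

    # e.g. fetch_with_retry("https://metadata.packet.net/metadata")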
Feb 13 09:52:11.631000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:11.633051 systemd[1]: Reached target initrd-root-device.target. Feb 13 09:52:11.712697 kernel: audit: type=1130 audit(1707817931.631:32): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:11.697669 systemd[1]: Reached target local-fs-pre.target. Feb 13 09:52:11.697703 systemd[1]: Reached target local-fs.target. Feb 13 09:52:11.721665 systemd[1]: Reached target sysinit.target. Feb 13 09:52:11.735663 systemd[1]: Reached target basic.target. Feb 13 09:52:11.749321 systemd[1]: Starting systemd-fsck-root.service... Feb 13 09:52:11.768749 systemd-fsck[940]: ROOT: clean, 602/553520 files, 56013/553472 blocks Feb 13 09:52:11.781822 systemd[1]: Finished systemd-fsck-root.service. Feb 13 09:52:11.875574 kernel: audit: type=1130 audit(1707817931.789:33): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:11.875589 kernel: EXT4-fs (sdb9): mounted filesystem with ordered data mode. Opts: (null). Quota mode: none. Feb 13 09:52:11.789000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:11.795636 systemd[1]: Mounting sysroot.mount... Feb 13 09:52:11.883070 systemd[1]: Mounted sysroot.mount. Feb 13 09:52:11.897720 systemd[1]: Reached target initrd-root-fs.target. Feb 13 09:52:11.905362 systemd[1]: Mounting sysroot-usr.mount... Feb 13 09:52:11.926420 systemd[1]: Starting flatcar-metadata-hostname.service... Feb 13 09:52:11.941977 systemd[1]: Starting flatcar-static-network.service... Feb 13 09:52:11.958554 systemd[1]: ignition-remount-sysroot.service was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Feb 13 09:52:11.958602 systemd[1]: Reached target ignition-diskful.target. Feb 13 09:52:11.978303 systemd[1]: Mounted sysroot-usr.mount. Feb 13 09:52:12.002283 systemd[1]: Mounting sysroot-usr-share-oem.mount... Feb 13 09:52:12.149245 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/sdb6 scanned by mount (953) Feb 13 09:52:12.149262 kernel: BTRFS info (device sdb6): using crc32c (crc32c-intel) checksum algorithm Feb 13 09:52:12.149272 kernel: BTRFS info (device sdb6): using free space tree Feb 13 09:52:12.149283 kernel: BTRFS info (device sdb6): has skinny extents Feb 13 09:52:12.149291 kernel: BTRFS info (device sdb6): enabling ssd optimizations Feb 13 09:52:12.149355 coreos-metadata[948]: Feb 13 09:52:12.075 INFO Fetching https://metadata.packet.net/metadata: Attempt #1 Feb 13 09:52:12.149355 coreos-metadata[948]: Feb 13 09:52:12.105 INFO Fetch successful Feb 13 09:52:12.276317 kernel: audit: type=1130 audit(1707817932.156:34): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:12.276330 kernel: audit: type=1130 audit(1707817932.219:35): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-metadata-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? 
terminal=? res=success' Feb 13 09:52:12.156000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:12.219000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-metadata-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:12.015164 systemd[1]: Starting initrd-setup-root.service... Feb 13 09:52:12.397153 kernel: audit: type=1130 audit(1707817932.283:36): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-static-network comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:12.397166 kernel: audit: type=1131 audit(1707817932.283:37): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-static-network comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:12.283000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-static-network comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:12.283000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-static-network comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:12.397206 coreos-metadata[947]: Feb 13 09:52:12.075 INFO Fetching https://metadata.packet.net/metadata: Attempt #1 Feb 13 09:52:12.397206 coreos-metadata[947]: Feb 13 09:52:12.105 INFO Fetch successful Feb 13 09:52:12.397206 coreos-metadata[947]: Feb 13 09:52:12.123 INFO wrote hostname ci-3510.3.2-a-e401d5bc82 to /sysroot/etc/hostname Feb 13 09:52:12.077415 systemd[1]: Finished initrd-setup-root.service. Feb 13 09:52:12.460574 initrd-setup-root[958]: cut: /sysroot/etc/passwd: No such file or directory Feb 13 09:52:12.468000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:12.158796 systemd[1]: Finished flatcar-metadata-hostname.service. Feb 13 09:52:12.533683 kernel: audit: type=1130 audit(1707817932.468:38): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:12.533746 initrd-setup-root[966]: cut: /sysroot/etc/group: No such file or directory Feb 13 09:52:12.220757 systemd[1]: flatcar-static-network.service: Deactivated successfully. Feb 13 09:52:12.553800 initrd-setup-root[974]: cut: /sysroot/etc/shadow: No such file or directory Feb 13 09:52:12.220795 systemd[1]: Finished flatcar-static-network.service. 
Feb 13 09:52:12.571716 ignition[1022]: INFO : Ignition 2.14.0 Feb 13 09:52:12.571716 ignition[1022]: INFO : Stage: mount Feb 13 09:52:12.571716 ignition[1022]: INFO : reading system config file "/usr/lib/ignition/base.d/base.ign" Feb 13 09:52:12.571716 ignition[1022]: DEBUG : parsing config with SHA512: 0131bd505bfe1b1215ca4ec9809701a3323bf448114294874f7249d8d300440bd742a7532f60673bfa0746c04de0bd5ca68d0fe9a8ecd59464b13a6401323cb4 Feb 13 09:52:12.571716 ignition[1022]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/packet" Feb 13 09:52:12.571716 ignition[1022]: INFO : mount: mount passed Feb 13 09:52:12.571716 ignition[1022]: INFO : POST message to Packet Timeline Feb 13 09:52:12.571716 ignition[1022]: INFO : GET https://metadata.packet.net/metadata: attempt #1 Feb 13 09:52:12.571716 ignition[1022]: INFO : GET result: OK Feb 13 09:52:12.662716 initrd-setup-root[982]: cut: /sysroot/etc/gshadow: No such file or directory Feb 13 09:52:12.284725 systemd[1]: Mounted sysroot-usr-share-oem.mount. Feb 13 09:52:12.406041 systemd[1]: Starting ignition-mount.service... Feb 13 09:52:12.433018 systemd[1]: Starting sysroot-boot.service... Feb 13 09:52:12.453911 systemd[1]: sysusr-usr-share-oem.mount: Deactivated successfully. Feb 13 09:52:12.453966 systemd[1]: sysroot-usr-share-oem.mount: Deactivated successfully. Feb 13 09:52:12.457081 systemd[1]: Finished sysroot-boot.service. Feb 13 09:52:12.828837 ignition[1022]: INFO : Ignition finished successfully Feb 13 09:52:12.831365 systemd[1]: Finished ignition-mount.service. Feb 13 09:52:12.846000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:12.849423 systemd[1]: Starting ignition-files.service... Feb 13 09:52:12.919547 kernel: audit: type=1130 audit(1707817932.846:39): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:12.914229 systemd[1]: Mounting sysroot-usr-share-oem.mount... Feb 13 09:52:12.977231 kernel: BTRFS: device label OEM devid 1 transid 17 /dev/sdb6 scanned by mount (1037) Feb 13 09:52:12.977247 kernel: BTRFS info (device sdb6): using crc32c (crc32c-intel) checksum algorithm Feb 13 09:52:12.977255 kernel: BTRFS info (device sdb6): using free space tree Feb 13 09:52:13.000440 kernel: BTRFS info (device sdb6): has skinny extents Feb 13 09:52:13.049456 kernel: BTRFS info (device sdb6): enabling ssd optimizations Feb 13 09:52:13.050682 systemd[1]: Mounted sysroot-usr-share-oem.mount. 
Feb 13 09:52:13.061919 unknown[1056]: wrote ssh authorized keys file for user: core Feb 13 09:52:13.073584 ignition[1056]: INFO : Ignition 2.14.0 Feb 13 09:52:13.073584 ignition[1056]: INFO : Stage: files Feb 13 09:52:13.073584 ignition[1056]: INFO : reading system config file "/usr/lib/ignition/base.d/base.ign" Feb 13 09:52:13.073584 ignition[1056]: DEBUG : parsing config with SHA512: 0131bd505bfe1b1215ca4ec9809701a3323bf448114294874f7249d8d300440bd742a7532f60673bfa0746c04de0bd5ca68d0fe9a8ecd59464b13a6401323cb4 Feb 13 09:52:13.073584 ignition[1056]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/packet" Feb 13 09:52:13.073584 ignition[1056]: DEBUG : files: compiled without relabeling support, skipping Feb 13 09:52:13.073584 ignition[1056]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Feb 13 09:52:13.073584 ignition[1056]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Feb 13 09:52:13.073584 ignition[1056]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Feb 13 09:52:13.073584 ignition[1056]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Feb 13 09:52:13.073584 ignition[1056]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Feb 13 09:52:13.073584 ignition[1056]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" Feb 13 09:52:13.073584 ignition[1056]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1 Feb 13 09:52:13.235727 ignition[1056]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Feb 13 09:52:13.235727 ignition[1056]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" Feb 13 09:52:13.235727 ignition[1056]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/opt/cni-plugins-linux-amd64-v1.1.1.tgz" Feb 13 09:52:13.235727 ignition[1056]: INFO : files: createFilesystemsFiles: createFiles: op(4): GET https://github.com/containernetworking/plugins/releases/download/v1.1.1/cni-plugins-linux-amd64-v1.1.1.tgz: attempt #1 Feb 13 09:52:13.595925 ignition[1056]: INFO : files: createFilesystemsFiles: createFiles: op(4): GET result: OK Feb 13 09:52:13.675971 ignition[1056]: DEBUG : files: createFilesystemsFiles: createFiles: op(4): file matches expected sum of: 4d0ed0abb5951b9cf83cba938ef84bdc5b681f4ac869da8143974f6a53a3ff30c666389fa462b9d14d30af09bf03f6cdf77598c572f8fb3ea00cecdda467a48d Feb 13 09:52:13.701702 ignition[1056]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/opt/cni-plugins-linux-amd64-v1.1.1.tgz" Feb 13 09:52:13.701702 ignition[1056]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/opt/crictl-v1.26.0-linux-amd64.tar.gz" Feb 13 09:52:13.701702 ignition[1056]: INFO : files: createFilesystemsFiles: createFiles: op(5): GET https://github.com/kubernetes-sigs/cri-tools/releases/download/v1.26.0/crictl-v1.26.0-linux-amd64.tar.gz: attempt #1 Feb 13 09:52:14.108751 ignition[1056]: INFO : files: createFilesystemsFiles: createFiles: op(5): GET result: OK Feb 13 09:52:14.159014 ignition[1056]: DEBUG : files: createFilesystemsFiles: createFiles: op(5): file matches expected sum of: 
a3a2c02a90b008686c20babaf272e703924db2a3e2a0d4e2a7c81d994cbc68c47458a4a354ecc243af095b390815c7f203348b9749351ae817bd52a522300449 Feb 13 09:52:14.182738 ignition[1056]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/opt/crictl-v1.26.0-linux-amd64.tar.gz" Feb 13 09:52:14.182738 ignition[1056]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/opt/bin/kubeadm" Feb 13 09:52:14.182738 ignition[1056]: INFO : files: createFilesystemsFiles: createFiles: op(6): GET https://dl.k8s.io/release/v1.26.5/bin/linux/amd64/kubeadm: attempt #1 Feb 13 09:52:14.232651 ignition[1056]: INFO : files: createFilesystemsFiles: createFiles: op(6): GET result: OK Feb 13 09:52:14.408605 ignition[1056]: DEBUG : files: createFilesystemsFiles: createFiles: op(6): file matches expected sum of: 1c324cd645a7bf93d19d24c87498d9a17878eb1cc927e2680200ffeab2f85051ddec47d85b79b8e774042dc6726299ad3d7caf52c060701f00deba30dc33f660 Feb 13 09:52:14.433709 ignition[1056]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/opt/bin/kubeadm" Feb 13 09:52:14.433709 ignition[1056]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/opt/bin/kubelet" Feb 13 09:52:14.433709 ignition[1056]: INFO : files: createFilesystemsFiles: createFiles: op(7): GET https://dl.k8s.io/release/v1.26.5/bin/linux/amd64/kubelet: attempt #1 Feb 13 09:52:14.481657 ignition[1056]: INFO : files: createFilesystemsFiles: createFiles: op(7): GET result: OK Feb 13 09:52:14.834446 ignition[1056]: DEBUG : files: createFilesystemsFiles: createFiles: op(7): file matches expected sum of: 40daf2a9b9e666c14b10e627da931bd79978628b1f23ef6429c1cb4fcba261f86ccff440c0dbb0070ee760fe55772b4fd279c4582dfbb17fa30bc94b7f00126b Feb 13 09:52:14.834446 ignition[1056]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/opt/bin/kubelet" Feb 13 09:52:14.875674 ignition[1056]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/opt/bin/kubectl" Feb 13 09:52:14.875674 ignition[1056]: INFO : files: createFilesystemsFiles: createFiles: op(8): GET https://dl.k8s.io/release/v1.26.5/bin/linux/amd64/kubectl: attempt #1 Feb 13 09:52:14.906517 ignition[1056]: INFO : files: createFilesystemsFiles: createFiles: op(8): GET result: OK Feb 13 09:52:15.010674 ignition[1056]: DEBUG : files: createFilesystemsFiles: createFiles: op(8): file matches expected sum of: 97840854134909d75a1a2563628cc4ba632067369ce7fc8a8a1e90a387d32dd7bfd73f4f5b5a82ef842088e7470692951eb7fc869c5f297dd740f855672ee628 Feb 13 09:52:15.010674 ignition[1056]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/opt/bin/kubectl" Feb 13 09:52:15.078658 kernel: BTRFS info: devid 1 device path /dev/sdb6 changed to /dev/disk/by-label/OEM scanned by ignition (1075) Feb 13 09:52:15.078673 ignition[1056]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing file "/sysroot/etc/docker/daemon.json" Feb 13 09:52:15.078673 ignition[1056]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing file "/sysroot/etc/docker/daemon.json" Feb 13 09:52:15.078673 ignition[1056]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/home/core/install.sh" Feb 13 09:52:15.078673 ignition[1056]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/home/core/install.sh" 
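Each "file matches expected sum of:" line above is Ignition hashing a freshly downloaded artifact and comparing it against the SHA-512 digest pinned in the config before installing it. A sketch of that verification; the digest is the crictl one copied from this log, and the local filename is illustrative:

    # Sketch: verify a downloaded artifact against its pinned SHA-512.
    import hashlib

    EXPECTED = ("a3a2c02a90b008686c20babaf272e703924db2a3e2a0d4e2a7c81d994cbc68c4"
                "7458a4a354ecc243af095b390815c7f203348b9749351ae817bd52a522300449")

    def matches_expected_sum(path: str) -> bool:
        h = hashlib.sha512()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest() == EXPECTED

    # e.g. matches_expected_sum("crictl-v1.26.0-linux-amd64.tar.gz")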
Feb 13 09:52:15.078673 ignition[1056]: INFO : files: createFilesystemsFiles: createFiles: op(b): [started] writing file "/sysroot/home/core/nginx.yaml" Feb 13 09:52:15.078673 ignition[1056]: INFO : files: createFilesystemsFiles: createFiles: op(b): [finished] writing file "/sysroot/home/core/nginx.yaml" Feb 13 09:52:15.078673 ignition[1056]: INFO : files: createFilesystemsFiles: createFiles: op(c): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Feb 13 09:52:15.078673 ignition[1056]: INFO : files: createFilesystemsFiles: createFiles: op(c): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Feb 13 09:52:15.078673 ignition[1056]: INFO : files: createFilesystemsFiles: createFiles: op(d): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Feb 13 09:52:15.078673 ignition[1056]: INFO : files: createFilesystemsFiles: createFiles: op(d): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Feb 13 09:52:15.078673 ignition[1056]: INFO : files: createFilesystemsFiles: createFiles: op(e): [started] writing file "/sysroot/etc/flatcar/update.conf" Feb 13 09:52:15.078673 ignition[1056]: INFO : files: createFilesystemsFiles: createFiles: op(e): [finished] writing file "/sysroot/etc/flatcar/update.conf" Feb 13 09:52:15.078673 ignition[1056]: INFO : files: createFilesystemsFiles: createFiles: op(f): [started] writing file "/sysroot/etc/systemd/system/packet-phone-home.service" Feb 13 09:52:15.078673 ignition[1056]: INFO : files: createFilesystemsFiles: createFiles: op(f): oem config not found in "/usr/share/oem", looking on oem partition Feb 13 09:52:15.078673 ignition[1056]: INFO : files: createFilesystemsFiles: createFiles: op(f): op(10): [started] mounting "/dev/disk/by-label/OEM" at "/mnt/oem2314855539" Feb 13 09:52:15.078673 ignition[1056]: CRITICAL : files: createFilesystemsFiles: createFiles: op(f): op(10): [failed] mounting "/dev/disk/by-label/OEM" at "/mnt/oem2314855539": device or resource busy Feb 13 09:52:15.404708 kernel: audit: type=1130 audit(1707817935.315:40): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:15.315000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:15.305220 systemd[1]: Finished ignition-files.service. 
Feb 13 09:52:15.419617 ignition[1056]: ERROR : files: createFilesystemsFiles: createFiles: op(f): failed to mount ext4 device "/dev/disk/by-label/OEM" at "/mnt/oem2314855539", trying btrfs: device or resource busy Feb 13 09:52:15.419617 ignition[1056]: INFO : files: createFilesystemsFiles: createFiles: op(f): op(11): [started] mounting "/dev/disk/by-label/OEM" at "/mnt/oem2314855539" Feb 13 09:52:15.419617 ignition[1056]: INFO : files: createFilesystemsFiles: createFiles: op(f): op(11): [finished] mounting "/dev/disk/by-label/OEM" at "/mnt/oem2314855539" Feb 13 09:52:15.419617 ignition[1056]: INFO : files: createFilesystemsFiles: createFiles: op(f): op(12): [started] unmounting "/mnt/oem2314855539" Feb 13 09:52:15.419617 ignition[1056]: INFO : files: createFilesystemsFiles: createFiles: op(f): op(12): [finished] unmounting "/mnt/oem2314855539" Feb 13 09:52:15.419617 ignition[1056]: INFO : files: createFilesystemsFiles: createFiles: op(f): [finished] writing file "/sysroot/etc/systemd/system/packet-phone-home.service" Feb 13 09:52:15.419617 ignition[1056]: INFO : files: op(13): [started] processing unit "coreos-metadata-sshkeys@.service" Feb 13 09:52:15.419617 ignition[1056]: INFO : files: op(13): [finished] processing unit "coreos-metadata-sshkeys@.service" Feb 13 09:52:15.419617 ignition[1056]: INFO : files: op(14): [started] processing unit "packet-phone-home.service" Feb 13 09:52:15.419617 ignition[1056]: INFO : files: op(14): [finished] processing unit "packet-phone-home.service" Feb 13 09:52:15.419617 ignition[1056]: INFO : files: op(15): [started] processing unit "prepare-cni-plugins.service" Feb 13 09:52:15.419617 ignition[1056]: INFO : files: op(15): op(16): [started] writing unit "prepare-cni-plugins.service" at "/sysroot/etc/systemd/system/prepare-cni-plugins.service" Feb 13 09:52:15.419617 ignition[1056]: INFO : files: op(15): op(16): [finished] writing unit "prepare-cni-plugins.service" at "/sysroot/etc/systemd/system/prepare-cni-plugins.service" Feb 13 09:52:15.419617 ignition[1056]: INFO : files: op(15): [finished] processing unit "prepare-cni-plugins.service" Feb 13 09:52:15.419617 ignition[1056]: INFO : files: op(17): [started] processing unit "prepare-critools.service" Feb 13 09:52:15.419617 ignition[1056]: INFO : files: op(17): op(18): [started] writing unit "prepare-critools.service" at "/sysroot/etc/systemd/system/prepare-critools.service" Feb 13 09:52:15.419617 ignition[1056]: INFO : files: op(17): op(18): [finished] writing unit "prepare-critools.service" at "/sysroot/etc/systemd/system/prepare-critools.service" Feb 13 09:52:15.418000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:15.455000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:15.455000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:15.540000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 09:52:15.540000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:15.627000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:15.747000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:15.322290 systemd[1]: Starting initrd-setup-root-after-ignition.service... Feb 13 09:52:15.807780 ignition[1056]: INFO : files: op(17): [finished] processing unit "prepare-critools.service" Feb 13 09:52:15.807780 ignition[1056]: INFO : files: op(19): [started] processing unit "prepare-helm.service" Feb 13 09:52:15.807780 ignition[1056]: INFO : files: op(19): op(1a): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Feb 13 09:52:15.807780 ignition[1056]: INFO : files: op(19): op(1a): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Feb 13 09:52:15.807780 ignition[1056]: INFO : files: op(19): [finished] processing unit "prepare-helm.service" Feb 13 09:52:15.807780 ignition[1056]: INFO : files: op(1b): [started] setting preset to enabled for "prepare-cni-plugins.service" Feb 13 09:52:15.807780 ignition[1056]: INFO : files: op(1b): [finished] setting preset to enabled for "prepare-cni-plugins.service" Feb 13 09:52:15.807780 ignition[1056]: INFO : files: op(1c): [started] setting preset to enabled for "prepare-critools.service" Feb 13 09:52:15.807780 ignition[1056]: INFO : files: op(1c): [finished] setting preset to enabled for "prepare-critools.service" Feb 13 09:52:15.807780 ignition[1056]: INFO : files: op(1d): [started] setting preset to enabled for "prepare-helm.service" Feb 13 09:52:15.807780 ignition[1056]: INFO : files: op(1d): [finished] setting preset to enabled for "prepare-helm.service" Feb 13 09:52:15.807780 ignition[1056]: INFO : files: op(1e): [started] setting preset to enabled for "coreos-metadata-sshkeys@.service " Feb 13 09:52:15.807780 ignition[1056]: INFO : files: op(1e): [finished] setting preset to enabled for "coreos-metadata-sshkeys@.service " Feb 13 09:52:15.807780 ignition[1056]: INFO : files: op(1f): [started] setting preset to enabled for "packet-phone-home.service" Feb 13 09:52:15.807780 ignition[1056]: INFO : files: op(1f): [finished] setting preset to enabled for "packet-phone-home.service" Feb 13 09:52:15.807780 ignition[1056]: INFO : files: createResultFile: createFiles: op(20): [started] writing file "/sysroot/etc/.ignition-result.json" Feb 13 09:52:15.807780 ignition[1056]: INFO : files: createResultFile: createFiles: op(20): [finished] writing file "/sysroot/etc/.ignition-result.json" Feb 13 09:52:15.807780 ignition[1056]: INFO : files: files passed Feb 13 09:52:15.807780 ignition[1056]: INFO : POST message to Packet Timeline Feb 13 09:52:15.807780 ignition[1056]: INFO : GET https://metadata.packet.net/metadata: attempt #1 Feb 13 09:52:15.807780 ignition[1056]: INFO : GET result: OK Feb 13 09:52:15.807780 ignition[1056]: INFO : Ignition finished successfully Feb 13 09:52:16.020000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount 
comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:16.064000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:16.087000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:16.201000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:16.216000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:16.232000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-metadata-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:16.272857 initrd-setup-root-after-ignition[1089]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Feb 13 09:52:16.279000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:15.383653 systemd[1]: torcx-profile-populate.service was skipped because of an unmet condition check (ConditionPathExists=/sysroot/etc/torcx/next-profile). Feb 13 09:52:16.317787 iscsid[906]: iscsid shutting down. Feb 13 09:52:16.324000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:15.383958 systemd[1]: Starting ignition-quench.service... Feb 13 09:52:16.339000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:15.411815 systemd[1]: Finished initrd-setup-root-after-ignition.service. Feb 13 09:52:15.419792 systemd[1]: ignition-quench.service: Deactivated successfully. Feb 13 09:52:16.362000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=iscsid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:15.419841 systemd[1]: Finished ignition-quench.service. Feb 13 09:52:15.456827 systemd[1]: Reached target ignition-complete.target. Feb 13 09:52:15.485377 systemd[1]: Starting initrd-parse-etc.service... Feb 13 09:52:16.490422 kernel: kauditd_printk_skb: 17 callbacks suppressed Feb 13 09:52:16.490585 kernel: audit: type=1131 audit(1707817936.408:58): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=iscsiuio comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:16.408000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=iscsiuio comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 09:52:16.490623 ignition[1104]: INFO : Ignition 2.14.0 Feb 13 09:52:16.490623 ignition[1104]: INFO : Stage: umount Feb 13 09:52:16.490623 ignition[1104]: INFO : reading system config file "/usr/lib/ignition/base.d/base.ign" Feb 13 09:52:16.490623 ignition[1104]: DEBUG : parsing config with SHA512: 0131bd505bfe1b1215ca4ec9809701a3323bf448114294874f7249d8d300440bd742a7532f60673bfa0746c04de0bd5ca68d0fe9a8ecd59464b13a6401323cb4 Feb 13 09:52:16.490623 ignition[1104]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/packet" Feb 13 09:52:16.490623 ignition[1104]: INFO : umount: umount passed Feb 13 09:52:16.490623 ignition[1104]: INFO : POST message to Packet Timeline Feb 13 09:52:16.490623 ignition[1104]: INFO : GET https://metadata.packet.net/metadata: attempt #1 Feb 13 09:52:16.490623 ignition[1104]: INFO : GET result: OK Feb 13 09:52:16.490623 ignition[1104]: INFO : Ignition finished successfully Feb 13 09:52:17.064706 kernel: audit: type=1130 audit(1707817936.497:59): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:17.064724 kernel: audit: type=1131 audit(1707817936.497:60): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:17.064735 kernel: audit: type=1131 audit(1707817936.618:61): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:17.064742 kernel: audit: type=1131 audit(1707817936.681:62): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:17.064823 kernel: audit: type=1131 audit(1707817936.785:63): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:17.064830 kernel: audit: type=1131 audit(1707817936.852:64): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:17.064837 kernel: audit: type=1131 audit(1707817936.919:65): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:17.064844 kernel: audit: type=1131 audit(1707817936.987:66): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:16.497000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:16.497000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:16.618000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 09:52:16.681000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:16.785000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:16.852000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:16.919000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:16.987000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:15.507944 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Feb 13 09:52:15.507998 systemd[1]: Finished initrd-parse-etc.service. Feb 13 09:52:17.088000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:15.541956 systemd[1]: Reached target initrd-fs.target. Feb 13 09:52:17.164543 kernel: audit: type=1131 audit(1707817937.088:67): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:17.156000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:17.157000 audit: BPF prog-id=6 op=UNLOAD Feb 13 09:52:15.567749 systemd[1]: Reached target initrd.target. Feb 13 09:52:15.587867 systemd[1]: dracut-mount.service was skipped because no trigger condition checks were met. Feb 13 09:52:15.589909 systemd[1]: Starting dracut-pre-pivot.service... Feb 13 09:52:17.205000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:15.616753 systemd[1]: Finished dracut-pre-pivot.service. Feb 13 09:52:17.221000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:15.629558 systemd[1]: Starting initrd-cleanup.service... Feb 13 09:52:17.237000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:15.665198 systemd[1]: Stopped target nss-lookup.target. Feb 13 09:52:15.681027 systemd[1]: Stopped target remote-cryptsetup.target. Feb 13 09:52:17.260000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:15.709197 systemd[1]: Stopped target timers.target. 
Feb 13 09:52:15.729013 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Feb 13 09:52:15.729343 systemd[1]: Stopped dracut-pre-pivot.service. Feb 13 09:52:17.316000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:15.749317 systemd[1]: Stopped target initrd.target. Feb 13 09:52:17.331000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:15.773159 systemd[1]: Stopped target basic.target. Feb 13 09:52:17.346000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:15.798192 systemd[1]: Stopped target ignition-complete.target. Feb 13 09:52:15.817022 systemd[1]: Stopped target ignition-diskful.target. Feb 13 09:52:17.375000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:15.837039 systemd[1]: Stopped target initrd-root-device.target. Feb 13 09:52:17.391000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:17.391000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:15.857053 systemd[1]: Stopped target remote-fs.target. Feb 13 09:52:15.884041 systemd[1]: Stopped target remote-fs-pre.target. Feb 13 09:52:15.911056 systemd[1]: Stopped target sysinit.target. Feb 13 09:52:15.932069 systemd[1]: Stopped target local-fs.target. Feb 13 09:52:15.956052 systemd[1]: Stopped target local-fs-pre.target. Feb 13 09:52:15.980042 systemd[1]: Stopped target swap.target. Feb 13 09:52:15.999928 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Feb 13 09:52:16.000279 systemd[1]: Stopped dracut-pre-mount.service. Feb 13 09:52:16.022255 systemd[1]: Stopped target cryptsetup.target. Feb 13 09:52:16.043937 systemd[1]: dracut-initqueue.service: Deactivated successfully. Feb 13 09:52:16.044291 systemd[1]: Stopped dracut-initqueue.service. Feb 13 09:52:16.066337 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Feb 13 09:52:16.066717 systemd[1]: Stopped ignition-fetch-offline.service. Feb 13 09:52:16.089223 systemd[1]: Stopped target paths.target. Feb 13 09:52:16.110907 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Feb 13 09:52:16.114719 systemd[1]: Stopped systemd-ask-password-console.path. Feb 13 09:52:16.133021 systemd[1]: Stopped target slices.target. Feb 13 09:52:16.153027 systemd[1]: Stopped target sockets.target. Feb 13 09:52:16.177031 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Feb 13 09:52:16.177373 systemd[1]: Stopped initrd-setup-root-after-ignition.service. Feb 13 09:52:16.203258 systemd[1]: ignition-files.service: Deactivated successfully. 
Feb 13 09:52:17.563000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=network-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:16.203631 systemd[1]: Stopped ignition-files.service. Feb 13 09:52:16.218080 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully. Feb 13 09:52:16.218398 systemd[1]: Stopped flatcar-metadata-hostname.service. Feb 13 09:52:16.236006 systemd[1]: Stopping ignition-mount.service... Feb 13 09:52:16.250679 systemd[1]: Stopping iscsid.service... Feb 13 09:52:16.264624 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Feb 13 09:52:16.264721 systemd[1]: Stopped kmod-static-nodes.service. Feb 13 09:52:16.281373 systemd[1]: Stopping sysroot-boot.service... Feb 13 09:52:16.301636 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Feb 13 09:52:16.301913 systemd[1]: Stopped systemd-udev-trigger.service. Feb 13 09:52:16.325991 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Feb 13 09:52:16.326284 systemd[1]: Stopped dracut-pre-trigger.service. Feb 13 09:52:16.347691 systemd[1]: sysroot-boot.mount: Deactivated successfully. Feb 13 09:52:16.349279 systemd[1]: iscsid.service: Deactivated successfully. Feb 13 09:52:16.349498 systemd[1]: Stopped iscsid.service. Feb 13 09:52:16.364929 systemd[1]: iscsid.socket: Deactivated successfully. Feb 13 09:52:16.365084 systemd[1]: Closed iscsid.socket. Feb 13 09:52:16.379817 systemd[1]: Stopping iscsiuio.service... Feb 13 09:52:16.395167 systemd[1]: iscsiuio.service: Deactivated successfully. Feb 13 09:52:16.395367 systemd[1]: Stopped iscsiuio.service. Feb 13 09:52:16.410280 systemd[1]: initrd-cleanup.service: Deactivated successfully. Feb 13 09:52:16.410494 systemd[1]: Finished initrd-cleanup.service. Feb 13 09:52:16.498827 systemd[1]: ignition-mount.service: Deactivated successfully. Feb 13 09:52:16.498863 systemd[1]: Stopped ignition-mount.service. Feb 13 09:52:16.619739 systemd[1]: sysroot-boot.service: Deactivated successfully. Feb 13 09:52:16.619777 systemd[1]: Stopped sysroot-boot.service. Feb 13 09:52:16.683040 systemd[1]: Stopped target network.target. Feb 13 09:52:16.748666 systemd[1]: iscsiuio.socket: Deactivated successfully. Feb 13 09:52:16.748684 systemd[1]: Closed iscsiuio.socket. Feb 13 09:52:16.774709 systemd[1]: ignition-disks.service: Deactivated successfully. Feb 13 09:52:16.774823 systemd[1]: Stopped ignition-disks.service. Feb 13 09:52:16.786832 systemd[1]: ignition-kargs.service: Deactivated successfully. Feb 13 09:52:17.678489 systemd-journald[268]: Received SIGTERM from PID 1 (n/a). Feb 13 09:52:16.786871 systemd[1]: Stopped ignition-kargs.service. Feb 13 09:52:16.873684 systemd[1]: ignition-setup.service: Deactivated successfully. Feb 13 09:52:16.873721 systemd[1]: Stopped ignition-setup.service. Feb 13 09:52:16.940669 systemd[1]: initrd-setup-root.service: Deactivated successfully. Feb 13 09:52:16.940725 systemd[1]: Stopped initrd-setup-root.service. Feb 13 09:52:16.988787 systemd[1]: Stopping systemd-networkd.service... Feb 13 09:52:17.053671 systemd-networkd[881]: enp1s0f1np1: DHCPv6 lease lost Feb 13 09:52:17.055804 systemd[1]: Stopping systemd-resolved.service... Feb 13 09:52:17.061474 systemd-networkd[881]: enp1s0f0np0: DHCPv6 lease lost Feb 13 09:52:17.677000 audit: BPF prog-id=9 op=UNLOAD Feb 13 09:52:17.071798 systemd[1]: systemd-resolved.service: Deactivated successfully. Feb 13 09:52:17.071841 systemd[1]: Stopped systemd-resolved.service. 
Feb 13 09:52:17.090621 systemd[1]: systemd-networkd.service: Deactivated successfully. Feb 13 09:52:17.090668 systemd[1]: Stopped systemd-networkd.service. Feb 13 09:52:17.157764 systemd[1]: systemd-networkd.socket: Deactivated successfully. Feb 13 09:52:17.157780 systemd[1]: Closed systemd-networkd.socket. Feb 13 09:52:17.172963 systemd[1]: Stopping network-cleanup.service... Feb 13 09:52:17.187665 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Feb 13 09:52:17.187714 systemd[1]: Stopped parse-ip-for-networkd.service. Feb 13 09:52:17.206657 systemd[1]: systemd-sysctl.service: Deactivated successfully. Feb 13 09:52:17.206691 systemd[1]: Stopped systemd-sysctl.service. Feb 13 09:52:17.222901 systemd[1]: systemd-modules-load.service: Deactivated successfully. Feb 13 09:52:17.222958 systemd[1]: Stopped systemd-modules-load.service. Feb 13 09:52:17.238974 systemd[1]: Stopping systemd-udevd.service... Feb 13 09:52:17.257108 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Feb 13 09:52:17.257933 systemd[1]: systemd-udevd.service: Deactivated successfully. Feb 13 09:52:17.257992 systemd[1]: Stopped systemd-udevd.service. Feb 13 09:52:17.262771 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Feb 13 09:52:17.262794 systemd[1]: Closed systemd-udevd-control.socket. Feb 13 09:52:17.284620 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Feb 13 09:52:17.284650 systemd[1]: Closed systemd-udevd-kernel.socket. Feb 13 09:52:17.301683 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Feb 13 09:52:17.301754 systemd[1]: Stopped dracut-pre-udev.service. Feb 13 09:52:17.317824 systemd[1]: dracut-cmdline.service: Deactivated successfully. Feb 13 09:52:17.317941 systemd[1]: Stopped dracut-cmdline.service. Feb 13 09:52:17.332848 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Feb 13 09:52:17.332981 systemd[1]: Stopped dracut-cmdline-ask.service. Feb 13 09:52:17.349672 systemd[1]: Starting initrd-udevadm-cleanup-db.service... Feb 13 09:52:17.362540 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Feb 13 09:52:17.362568 systemd[1]: Stopped systemd-vconsole-setup.service. Feb 13 09:52:17.376771 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Feb 13 09:52:17.376822 systemd[1]: Finished initrd-udevadm-cleanup-db.service. Feb 13 09:52:17.553972 systemd[1]: network-cleanup.service: Deactivated successfully. Feb 13 09:52:17.554190 systemd[1]: Stopped network-cleanup.service. Feb 13 09:52:17.565039 systemd[1]: Reached target initrd-switch-root.target. Feb 13 09:52:17.583385 systemd[1]: Starting initrd-switch-root.service... Feb 13 09:52:17.618307 systemd[1]: Switching root. Feb 13 09:52:17.680240 systemd-journald[268]: Journal stopped Feb 13 09:52:21.330584 kernel: SELinux: Class mctp_socket not defined in policy. Feb 13 09:52:21.330597 kernel: SELinux: Class anon_inode not defined in policy. 
Feb 13 09:52:21.330606 kernel: SELinux: the above unknown classes and permissions will be allowed Feb 13 09:52:21.330611 kernel: SELinux: policy capability network_peer_controls=1 Feb 13 09:52:21.330616 kernel: SELinux: policy capability open_perms=1 Feb 13 09:52:21.330621 kernel: SELinux: policy capability extended_socket_class=1 Feb 13 09:52:21.330627 kernel: SELinux: policy capability always_check_network=0 Feb 13 09:52:21.330633 kernel: SELinux: policy capability cgroup_seclabel=1 Feb 13 09:52:21.330638 kernel: SELinux: policy capability nnp_nosuid_transition=1 Feb 13 09:52:21.330644 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Feb 13 09:52:21.330649 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Feb 13 09:52:21.330655 systemd[1]: Successfully loaded SELinux policy in 325.965ms. Feb 13 09:52:21.330662 systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 5.908ms. Feb 13 09:52:21.330669 systemd[1]: systemd 252 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL -ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE -TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified) Feb 13 09:52:21.330676 systemd[1]: Detected architecture x86-64. Feb 13 09:52:21.330682 systemd[1]: Detected first boot. Feb 13 09:52:21.330688 systemd[1]: Hostname set to . Feb 13 09:52:21.330694 systemd[1]: Initializing machine ID from random generator. Feb 13 09:52:21.330700 kernel: SELinux: Context system_u:object_r:container_file_t:s0:c1022,c1023 is not valid (left unmapped). Feb 13 09:52:21.330706 systemd[1]: Populated /etc with preset unit settings. Feb 13 09:52:21.330712 systemd[1]: /usr/lib/systemd/system/locksmithd.service:8: Unit uses CPUShares=; please use CPUWeight= instead. Support for CPUShares= will be removed soon. Feb 13 09:52:21.330719 systemd[1]: /usr/lib/systemd/system/locksmithd.service:9: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 13 09:52:21.330726 systemd[1]: /run/systemd/system/docker.socket:8: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Feb 13 09:52:21.330732 systemd[1]: initrd-switch-root.service: Deactivated successfully. Feb 13 09:52:21.330738 systemd[1]: Stopped initrd-switch-root.service. Feb 13 09:52:21.330744 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Feb 13 09:52:21.330750 systemd[1]: Created slice system-addon\x2dconfig.slice. Feb 13 09:52:21.330758 systemd[1]: Created slice system-addon\x2drun.slice. Feb 13 09:52:21.330764 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice. Feb 13 09:52:21.330770 systemd[1]: Created slice system-getty.slice. Feb 13 09:52:21.330776 systemd[1]: Created slice system-modprobe.slice. Feb 13 09:52:21.330782 systemd[1]: Created slice system-serial\x2dgetty.slice. Feb 13 09:52:21.330788 systemd[1]: Created slice system-system\x2dcloudinit.slice. Feb 13 09:52:21.330794 systemd[1]: Created slice system-systemd\x2dfsck.slice. Feb 13 09:52:21.330800 systemd[1]: Created slice user.slice. Feb 13 09:52:21.330806 systemd[1]: Started systemd-ask-password-console.path. Feb 13 09:52:21.330813 systemd[1]: Started systemd-ask-password-wall.path. Feb 13 09:52:21.330819 systemd[1]: Set up automount boot.automount. 
Feb 13 09:52:21.330825 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount. Feb 13 09:52:21.330831 systemd[1]: Stopped target initrd-switch-root.target. Feb 13 09:52:21.330839 systemd[1]: Stopped target initrd-fs.target. Feb 13 09:52:21.330845 systemd[1]: Stopped target initrd-root-fs.target. Feb 13 09:52:21.330851 systemd[1]: Reached target integritysetup.target. Feb 13 09:52:21.330857 systemd[1]: Reached target remote-cryptsetup.target. Feb 13 09:52:21.330865 systemd[1]: Reached target remote-fs.target. Feb 13 09:52:21.330871 systemd[1]: Reached target slices.target. Feb 13 09:52:21.330877 systemd[1]: Reached target swap.target. Feb 13 09:52:21.330883 systemd[1]: Reached target torcx.target. Feb 13 09:52:21.330889 systemd[1]: Reached target veritysetup.target. Feb 13 09:52:21.330895 systemd[1]: Listening on systemd-coredump.socket. Feb 13 09:52:21.330901 systemd[1]: Listening on systemd-initctl.socket. Feb 13 09:52:21.330907 systemd[1]: Listening on systemd-networkd.socket. Feb 13 09:52:21.330915 systemd[1]: Listening on systemd-udevd-control.socket. Feb 13 09:52:21.330922 systemd[1]: Listening on systemd-udevd-kernel.socket. Feb 13 09:52:21.330928 systemd[1]: Listening on systemd-userdbd.socket. Feb 13 09:52:21.330935 systemd[1]: Mounting dev-hugepages.mount... Feb 13 09:52:21.330941 systemd[1]: Mounting dev-mqueue.mount... Feb 13 09:52:21.330947 systemd[1]: Mounting media.mount... Feb 13 09:52:21.330955 systemd[1]: proc-xen.mount was skipped because of an unmet condition check (ConditionVirtualization=xen). Feb 13 09:52:21.330961 systemd[1]: Mounting sys-kernel-debug.mount... Feb 13 09:52:21.330968 systemd[1]: Mounting sys-kernel-tracing.mount... Feb 13 09:52:21.330974 systemd[1]: Mounting tmp.mount... Feb 13 09:52:21.330980 systemd[1]: Starting flatcar-tmpfiles.service... Feb 13 09:52:21.330987 systemd[1]: ignition-delete-config.service was skipped because no trigger condition checks were met. Feb 13 09:52:21.330993 systemd[1]: Starting kmod-static-nodes.service... Feb 13 09:52:21.330999 systemd[1]: Starting modprobe@configfs.service... Feb 13 09:52:21.331005 systemd[1]: Starting modprobe@dm_mod.service... Feb 13 09:52:21.331013 systemd[1]: Starting modprobe@drm.service... Feb 13 09:52:21.331019 systemd[1]: Starting modprobe@efi_pstore.service... Feb 13 09:52:21.331026 systemd[1]: Starting modprobe@fuse.service... Feb 13 09:52:21.331032 kernel: fuse: init (API version 7.34) Feb 13 09:52:21.331038 systemd[1]: Starting modprobe@loop.service... Feb 13 09:52:21.331044 kernel: loop: module loaded Feb 13 09:52:21.331050 systemd[1]: setup-nsswitch.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Feb 13 09:52:21.331057 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Feb 13 09:52:21.331064 systemd[1]: Stopped systemd-fsck-root.service. Feb 13 09:52:21.331071 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Feb 13 09:52:21.331077 systemd[1]: Stopped systemd-fsck-usr.service. Feb 13 09:52:21.331083 systemd[1]: Stopped systemd-journald.service. Feb 13 09:52:21.331089 systemd[1]: Starting systemd-journald.service... Feb 13 09:52:21.331096 systemd[1]: Starting systemd-modules-load.service... Feb 13 09:52:21.331105 systemd-journald[1256]: Journal started Feb 13 09:52:21.331131 systemd-journald[1256]: Runtime Journal (/run/log/journal/14537728e64448fc8d2a0e6f5ba05368) is 8.0M, max 640.1M, 632.1M free. 
Feb 13 09:52:18.090000 audit: MAC_POLICY_LOAD auid=4294967295 ses=4294967295 lsm=selinux res=1 Feb 13 09:52:18.359000 audit[1]: AVC avc: denied { integrity } for pid=1 comm="systemd" lockdown_reason="/dev/mem,kmem,port" scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=lockdown permissive=1 Feb 13 09:52:18.361000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=1 Feb 13 09:52:18.361000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=1 Feb 13 09:52:18.361000 audit: BPF prog-id=10 op=LOAD Feb 13 09:52:18.361000 audit: BPF prog-id=10 op=UNLOAD Feb 13 09:52:18.361000 audit: BPF prog-id=11 op=LOAD Feb 13 09:52:18.361000 audit: BPF prog-id=11 op=UNLOAD Feb 13 09:52:18.428000 audit[1146]: AVC avc: denied { associate } for pid=1146 comm="torcx-generator" name="docker" dev="tmpfs" ino=2 scontext=system_u:object_r:unlabeled_t:s0 tcontext=system_u:object_r:tmpfs_t:s0 tclass=filesystem permissive=1 srawcon="system_u:object_r:container_file_t:s0:c1022,c1023" Feb 13 09:52:18.428000 audit[1146]: SYSCALL arch=c000003e syscall=188 success=yes exit=0 a0=c0001a58dc a1=c00002ce58 a2=c00002bb00 a3=32 items=0 ppid=1129 pid=1146 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="torcx-generator" exe="/usr/lib/systemd/system-generators/torcx-generator" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:52:18.428000 audit: PROCTITLE proctitle=2F7573722F6C69622F73797374656D642F73797374656D2D67656E657261746F72732F746F7263782D67656E657261746F72002F72756E2F73797374656D642F67656E657261746F72002F72756E2F73797374656D642F67656E657261746F722E6561726C79002F72756E2F73797374656D642F67656E657261746F722E6C61 Feb 13 09:52:18.454000 audit[1146]: AVC avc: denied { associate } for pid=1146 comm="torcx-generator" name="lib" scontext=system_u:object_r:unlabeled_t:s0 tcontext=system_u:object_r:tmpfs_t:s0 tclass=filesystem permissive=1 Feb 13 09:52:18.454000 audit[1146]: SYSCALL arch=c000003e syscall=258 success=yes exit=0 a0=ffffffffffffff9c a1=c0001a59b5 a2=1ed a3=0 items=2 ppid=1129 pid=1146 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="torcx-generator" exe="/usr/lib/systemd/system-generators/torcx-generator" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:52:18.454000 audit: CWD cwd="/" Feb 13 09:52:18.454000 audit: PATH item=0 name=(null) inode=2 dev=00:1b mode=040755 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:unlabeled_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 09:52:18.454000 audit: PATH item=1 name=(null) inode=3 dev=00:1b mode=040755 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:unlabeled_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 09:52:18.454000 audit: PROCTITLE proctitle=2F7573722F6C69622F73797374656D642F73797374656D2D67656E657261746F72732F746F7263782D67656E657261746F72002F72756E2F73797374656D642F67656E657261746F72002F72756E2F73797374656D642F67656E657261746F722E6561726C79002F72756E2F73797374656D642F67656E657261746F722E6C61 Feb 13 09:52:19.989000 audit: BPF prog-id=12 op=LOAD Feb 13 09:52:19.989000 audit: BPF prog-id=3 op=UNLOAD Feb 13 09:52:19.989000 audit: BPF prog-id=13 op=LOAD Feb 13 09:52:19.990000 audit: BPF prog-id=14 
op=LOAD Feb 13 09:52:19.990000 audit: BPF prog-id=4 op=UNLOAD Feb 13 09:52:19.990000 audit: BPF prog-id=5 op=UNLOAD Feb 13 09:52:19.990000 audit: BPF prog-id=15 op=LOAD Feb 13 09:52:19.990000 audit: BPF prog-id=12 op=UNLOAD Feb 13 09:52:19.991000 audit: BPF prog-id=16 op=LOAD Feb 13 09:52:19.991000 audit: BPF prog-id=17 op=LOAD Feb 13 09:52:19.991000 audit: BPF prog-id=13 op=UNLOAD Feb 13 09:52:19.991000 audit: BPF prog-id=14 op=UNLOAD Feb 13 09:52:19.991000 audit: BPF prog-id=18 op=LOAD Feb 13 09:52:19.991000 audit: BPF prog-id=15 op=UNLOAD Feb 13 09:52:19.991000 audit: BPF prog-id=19 op=LOAD Feb 13 09:52:19.991000 audit: BPF prog-id=20 op=LOAD Feb 13 09:52:19.991000 audit: BPF prog-id=16 op=UNLOAD Feb 13 09:52:19.991000 audit: BPF prog-id=17 op=UNLOAD Feb 13 09:52:19.991000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:20.045000 audit: BPF prog-id=18 op=UNLOAD Feb 13 09:52:20.046000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=initrd-switch-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:20.046000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=initrd-switch-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:21.245000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:21.281000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck-usr comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:21.302000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:21.302000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:21.302000 audit: BPF prog-id=21 op=LOAD Feb 13 09:52:21.303000 audit: BPF prog-id=22 op=LOAD Feb 13 09:52:21.303000 audit: BPF prog-id=23 op=LOAD Feb 13 09:52:21.303000 audit: BPF prog-id=19 op=UNLOAD Feb 13 09:52:21.303000 audit: BPF prog-id=20 op=UNLOAD Feb 13 09:52:21.327000 audit: CONFIG_CHANGE op=set audit_enabled=1 old=1 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 res=1 Feb 13 09:52:21.327000 audit[1256]: SYSCALL arch=c000003e syscall=46 success=yes exit=60 a0=6 a1=7fffd3bd0ee0 a2=4000 a3=7fffd3bd0f7c items=0 ppid=1 pid=1256 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="systemd-journal" exe="/usr/lib/systemd/systemd-journald" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:52:21.327000 audit: PROCTITLE proctitle="/usr/lib/systemd/systemd-journald" Feb 13 09:52:19.989599 systemd[1]: Queued start job for default target multi-user.target. 
Feb 13 09:52:18.426771 /usr/lib/systemd/system-generators/torcx-generator[1146]: time="2024-02-13T09:52:18Z" level=debug msg="common configuration parsed" base_dir=/var/lib/torcx/ conf_dir=/etc/torcx/ run_dir=/run/torcx/ store_paths="[/usr/share/torcx/store /usr/share/oem/torcx/store/3510.3.2 /usr/share/oem/torcx/store /var/lib/torcx/store/3510.3.2 /var/lib/torcx/store]" Feb 13 09:52:19.993271 systemd[1]: systemd-journald.service: Deactivated successfully. Feb 13 09:52:18.427253 /usr/lib/systemd/system-generators/torcx-generator[1146]: time="2024-02-13T09:52:18Z" level=debug msg="profile found" name=docker-1.12-no path=/usr/share/torcx/profiles/docker-1.12-no.json Feb 13 09:52:18.427269 /usr/lib/systemd/system-generators/torcx-generator[1146]: time="2024-02-13T09:52:18Z" level=debug msg="profile found" name=vendor path=/usr/share/torcx/profiles/vendor.json Feb 13 09:52:18.427291 /usr/lib/systemd/system-generators/torcx-generator[1146]: time="2024-02-13T09:52:18Z" level=info msg="no vendor profile selected by /etc/flatcar/docker-1.12" Feb 13 09:52:18.427299 /usr/lib/systemd/system-generators/torcx-generator[1146]: time="2024-02-13T09:52:18Z" level=debug msg="skipped missing lower profile" missing profile=oem Feb 13 09:52:18.427320 /usr/lib/systemd/system-generators/torcx-generator[1146]: time="2024-02-13T09:52:18Z" level=warning msg="no next profile: unable to read profile file: open /etc/torcx/next-profile: no such file or directory" Feb 13 09:52:18.427329 /usr/lib/systemd/system-generators/torcx-generator[1146]: time="2024-02-13T09:52:18Z" level=debug msg="apply configuration parsed" lower profiles (vendor/oem)="[vendor]" upper profile (user)= Feb 13 09:52:18.427473 /usr/lib/systemd/system-generators/torcx-generator[1146]: time="2024-02-13T09:52:18Z" level=debug msg="mounted tmpfs" target=/run/torcx/unpack Feb 13 09:52:18.427504 /usr/lib/systemd/system-generators/torcx-generator[1146]: time="2024-02-13T09:52:18Z" level=debug msg="profile found" name=docker-1.12-no path=/usr/share/torcx/profiles/docker-1.12-no.json Feb 13 09:52:18.427514 /usr/lib/systemd/system-generators/torcx-generator[1146]: time="2024-02-13T09:52:18Z" level=debug msg="profile found" name=vendor path=/usr/share/torcx/profiles/vendor.json Feb 13 09:52:18.427941 /usr/lib/systemd/system-generators/torcx-generator[1146]: time="2024-02-13T09:52:18Z" level=debug msg="new archive/reference added to cache" format=tgz name=docker path="/usr/share/torcx/store/docker:20.10.torcx.tgz" reference=20.10 Feb 13 09:52:18.427968 /usr/lib/systemd/system-generators/torcx-generator[1146]: time="2024-02-13T09:52:18Z" level=debug msg="new archive/reference added to cache" format=tgz name=docker path="/usr/share/torcx/store/docker:com.coreos.cl.torcx.tgz" reference=com.coreos.cl Feb 13 09:52:18.427982 /usr/lib/systemd/system-generators/torcx-generator[1146]: time="2024-02-13T09:52:18Z" level=info msg="store skipped" err="open /usr/share/oem/torcx/store/3510.3.2: no such file or directory" path=/usr/share/oem/torcx/store/3510.3.2 Feb 13 09:52:18.427992 /usr/lib/systemd/system-generators/torcx-generator[1146]: time="2024-02-13T09:52:18Z" level=info msg="store skipped" err="open /usr/share/oem/torcx/store: no such file or directory" path=/usr/share/oem/torcx/store Feb 13 09:52:18.428004 /usr/lib/systemd/system-generators/torcx-generator[1146]: time="2024-02-13T09:52:18Z" level=info msg="store skipped" err="open /var/lib/torcx/store/3510.3.2: no such file or directory" path=/var/lib/torcx/store/3510.3.2 Feb 13 09:52:18.428014 
/usr/lib/systemd/system-generators/torcx-generator[1146]: time="2024-02-13T09:52:18Z" level=info msg="store skipped" err="open /var/lib/torcx/store: no such file or directory" path=/var/lib/torcx/store Feb 13 09:52:19.643107 /usr/lib/systemd/system-generators/torcx-generator[1146]: time="2024-02-13T09:52:19Z" level=debug msg="image unpacked" image=docker path=/run/torcx/unpack/docker reference=com.coreos.cl Feb 13 09:52:19.643243 /usr/lib/systemd/system-generators/torcx-generator[1146]: time="2024-02-13T09:52:19Z" level=debug msg="binaries propagated" assets="[/bin/containerd /bin/containerd-shim /bin/ctr /bin/docker /bin/docker-containerd /bin/docker-containerd-shim /bin/docker-init /bin/docker-proxy /bin/docker-runc /bin/dockerd /bin/runc /bin/tini]" image=docker path=/run/torcx/unpack/docker reference=com.coreos.cl Feb 13 09:52:19.643297 /usr/lib/systemd/system-generators/torcx-generator[1146]: time="2024-02-13T09:52:19Z" level=debug msg="networkd units propagated" assets="[/lib/systemd/network/50-docker.network /lib/systemd/network/90-docker-veth.network]" image=docker path=/run/torcx/unpack/docker reference=com.coreos.cl Feb 13 09:52:19.643389 /usr/lib/systemd/system-generators/torcx-generator[1146]: time="2024-02-13T09:52:19Z" level=debug msg="systemd units propagated" assets="[/lib/systemd/system/containerd.service /lib/systemd/system/docker.service /lib/systemd/system/docker.socket /lib/systemd/system/sockets.target.wants /lib/systemd/system/multi-user.target.wants]" image=docker path=/run/torcx/unpack/docker reference=com.coreos.cl Feb 13 09:52:19.643420 /usr/lib/systemd/system-generators/torcx-generator[1146]: time="2024-02-13T09:52:19Z" level=debug msg="profile applied" sealed profile=/run/torcx/profile.json upper profile= Feb 13 09:52:19.643457 /usr/lib/systemd/system-generators/torcx-generator[1146]: time="2024-02-13T09:52:19Z" level=debug msg="system state sealed" content="[TORCX_LOWER_PROFILES=\"vendor\" TORCX_UPPER_PROFILE=\"\" TORCX_PROFILE_PATH=\"/run/torcx/profile.json\" TORCX_BINDIR=\"/run/torcx/bin\" TORCX_UNPACKDIR=\"/run/torcx/unpack\"]" path=/run/metadata/torcx Feb 13 09:52:21.361641 systemd[1]: Starting systemd-network-generator.service... Feb 13 09:52:21.383487 systemd[1]: Starting systemd-remount-fs.service... Feb 13 09:52:21.405500 systemd[1]: Starting systemd-udev-trigger.service... Feb 13 09:52:21.437984 systemd[1]: verity-setup.service: Deactivated successfully. Feb 13 09:52:21.438005 systemd[1]: Stopped verity-setup.service. Feb 13 09:52:21.443000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=verity-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:21.472019 systemd[1]: xenserver-pv-version.service was skipped because of an unmet condition check (ConditionVirtualization=xen). Feb 13 09:52:21.472040 kernel: kauditd_printk_skb: 65 callbacks suppressed Feb 13 09:52:21.472050 kernel: audit: type=1131 audit(1707817941.443:124): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=verity-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:21.485495 systemd[1]: Started systemd-journald.service. Feb 13 09:52:21.546000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 09:52:21.547986 systemd[1]: Mounted dev-hugepages.mount. Feb 13 09:52:21.590640 kernel: audit: type=1130 audit(1707817941.546:125): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:21.596702 systemd[1]: Mounted dev-mqueue.mount. Feb 13 09:52:21.603689 systemd[1]: Mounted media.mount. Feb 13 09:52:21.610704 systemd[1]: Mounted sys-kernel-debug.mount. Feb 13 09:52:21.619688 systemd[1]: Mounted sys-kernel-tracing.mount. Feb 13 09:52:21.627673 systemd[1]: Mounted tmp.mount. Feb 13 09:52:21.634753 systemd[1]: Finished flatcar-tmpfiles.service. Feb 13 09:52:21.641000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=flatcar-tmpfiles comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:21.642795 systemd[1]: Finished kmod-static-nodes.service. Feb 13 09:52:21.686592 kernel: audit: type=1130 audit(1707817941.641:126): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=flatcar-tmpfiles comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:21.693000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:21.694747 systemd[1]: modprobe@configfs.service: Deactivated successfully. Feb 13 09:52:21.694815 systemd[1]: Finished modprobe@configfs.service. Feb 13 09:52:21.740635 kernel: audit: type=1130 audit(1707817941.693:127): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:21.747000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:21.748776 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Feb 13 09:52:21.748852 systemd[1]: Finished modprobe@dm_mod.service. Feb 13 09:52:21.747000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:21.796504 kernel: audit: type=1130 audit(1707817941.747:128): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:21.796539 kernel: audit: type=1131 audit(1707817941.747:129): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:21.850000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:21.851778 systemd[1]: modprobe@drm.service: Deactivated successfully. Feb 13 09:52:21.851856 systemd[1]: Finished modprobe@drm.service. 
Feb 13 09:52:21.850000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:21.901492 kernel: audit: type=1130 audit(1707817941.850:130): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:21.901511 kernel: audit: type=1131 audit(1707817941.850:131): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:21.959000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:21.960779 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Feb 13 09:52:21.960854 systemd[1]: Finished modprobe@efi_pstore.service. Feb 13 09:52:21.959000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:22.012524 kernel: audit: type=1130 audit(1707817941.959:132): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:22.012543 kernel: audit: type=1131 audit(1707817941.959:133): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:22.073000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:22.073000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:22.074775 systemd[1]: modprobe@fuse.service: Deactivated successfully. Feb 13 09:52:22.074836 systemd[1]: Finished modprobe@fuse.service. Feb 13 09:52:22.083000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:22.083000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:22.084826 systemd[1]: modprobe@loop.service: Deactivated successfully. Feb 13 09:52:22.084885 systemd[1]: Finished modprobe@loop.service. Feb 13 09:52:22.092000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 09:52:22.092000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:22.093789 systemd[1]: Finished systemd-modules-load.service. Feb 13 09:52:22.101000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:22.102788 systemd[1]: Finished systemd-network-generator.service. Feb 13 09:52:22.110000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:22.111763 systemd[1]: Finished systemd-remount-fs.service. Feb 13 09:52:22.119000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-remount-fs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:22.120768 systemd[1]: Finished systemd-udev-trigger.service. Feb 13 09:52:22.128000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:22.129984 systemd[1]: Reached target network-pre.target. Feb 13 09:52:22.140629 systemd[1]: Mounting sys-fs-fuse-connections.mount... Feb 13 09:52:22.150881 systemd[1]: Mounting sys-kernel-config.mount... Feb 13 09:52:22.157687 systemd[1]: remount-root.service was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Feb 13 09:52:22.160347 systemd[1]: Starting systemd-hwdb-update.service... Feb 13 09:52:22.167995 systemd[1]: Starting systemd-journal-flush.service... Feb 13 09:52:22.171222 systemd-journald[1256]: Time spent on flushing to /var/log/journal/14537728e64448fc8d2a0e6f5ba05368 is 15.224ms for 1624 entries. Feb 13 09:52:22.171222 systemd-journald[1256]: System Journal (/var/log/journal/14537728e64448fc8d2a0e6f5ba05368) is 8.0M, max 195.6M, 187.6M free. Feb 13 09:52:22.220470 systemd-journald[1256]: Received client request to flush runtime journal. Feb 13 09:52:22.184606 systemd[1]: systemd-pstore.service was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Feb 13 09:52:22.185916 systemd[1]: Starting systemd-random-seed.service... Feb 13 09:52:22.203559 systemd[1]: systemd-repart.service was skipped because no trigger condition checks were met. Feb 13 09:52:22.204274 systemd[1]: Starting systemd-sysctl.service... Feb 13 09:52:22.211410 systemd[1]: Starting systemd-sysusers.service... Feb 13 09:52:22.219057 systemd[1]: Starting systemd-udev-settle.service... Feb 13 09:52:22.226586 systemd[1]: Mounted sys-fs-fuse-connections.mount. Feb 13 09:52:22.234626 systemd[1]: Mounted sys-kernel-config.mount. Feb 13 09:52:22.242649 systemd[1]: Finished systemd-journal-flush.service. Feb 13 09:52:22.249000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-flush comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:22.250691 systemd[1]: Finished systemd-random-seed.service. 
Feb 13 09:52:22.257000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-random-seed comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:22.258667 systemd[1]: Finished systemd-sysctl.service. Feb 13 09:52:22.265000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:22.266673 systemd[1]: Finished systemd-sysusers.service. Feb 13 09:52:22.273000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysusers comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:22.275612 systemd[1]: Reached target first-boot-complete.target. Feb 13 09:52:22.283735 udevadm[1272]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation-early.service, lvm2-activation.service not to pull it in. Feb 13 09:52:22.463289 systemd[1]: Finished systemd-hwdb-update.service. Feb 13 09:52:22.471000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-hwdb-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:22.471000 audit: BPF prog-id=24 op=LOAD Feb 13 09:52:22.472000 audit: BPF prog-id=25 op=LOAD Feb 13 09:52:22.472000 audit: BPF prog-id=7 op=UNLOAD Feb 13 09:52:22.472000 audit: BPF prog-id=8 op=UNLOAD Feb 13 09:52:22.473782 systemd[1]: Starting systemd-udevd.service... Feb 13 09:52:22.485124 systemd-udevd[1273]: Using default interface naming scheme 'v252'. Feb 13 09:52:22.504096 systemd[1]: Started systemd-udevd.service. Feb 13 09:52:22.512000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:22.515686 systemd[1]: Condition check resulted in dev-ttyS1.device being skipped. Feb 13 09:52:22.515000 audit: BPF prog-id=26 op=LOAD Feb 13 09:52:22.517132 systemd[1]: Starting systemd-networkd.service... Feb 13 09:52:22.542000 audit: BPF prog-id=27 op=LOAD Feb 13 09:52:22.564255 kernel: input: Sleep Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0E:00/input/input2 Feb 13 09:52:22.564363 kernel: ACPI: button: Sleep Button [SLPB] Feb 13 09:52:22.564377 kernel: BTRFS info: devid 1 device path /dev/disk/by-label/OEM changed to /dev/sdb6 scanned by (udev-worker) (1280) Feb 13 09:52:22.564389 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3 Feb 13 09:52:22.613000 audit: BPF prog-id=28 op=LOAD Feb 13 09:52:22.614000 audit: BPF prog-id=29 op=LOAD Feb 13 09:52:22.618592 systemd[1]: Starting systemd-userdbd.service... Feb 13 09:52:22.620458 kernel: mousedev: PS/2 mouse device common for all mice Feb 13 09:52:22.620486 kernel: IPMI message handler: version 39.2 Feb 13 09:52:22.620507 kernel: ACPI: button: Power Button [PWRF] Feb 13 09:52:22.558000 audit[1340]: AVC avc: denied { confidentiality } for pid=1340 comm="(udev-worker)" lockdown_reason="use of tracefs" scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=lockdown permissive=1 Feb 13 09:52:22.699120 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device. 
Feb 13 09:52:22.707606 systemd[1]: Started systemd-userdbd.service. Feb 13 09:52:22.736471 kernel: mei_me 0000:00:16.0: Device doesn't have valid ME Interface Feb 13 09:52:22.736688 kernel: mei_me 0000:00:16.4: Device doesn't have valid ME Interface Feb 13 09:52:22.722000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-userdbd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:22.558000 audit[1340]: SYSCALL arch=c000003e syscall=175 success=yes exit=0 a0=7f9890e1a010 a1=4d8bc a2=7f9892ac7bc5 a3=5 items=42 ppid=1273 pid=1340 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="(udev-worker)" exe="/usr/bin/udevadm" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:52:22.558000 audit: CWD cwd="/" Feb 13 09:52:22.558000 audit: PATH item=0 name=(null) inode=45 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 09:52:22.558000 audit: PATH item=1 name=(null) inode=27055 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 09:52:22.558000 audit: PATH item=2 name=(null) inode=27055 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 09:52:22.558000 audit: PATH item=3 name=(null) inode=27056 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 09:52:22.558000 audit: PATH item=4 name=(null) inode=27055 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 09:52:22.558000 audit: PATH item=5 name=(null) inode=27057 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 09:52:22.558000 audit: PATH item=6 name=(null) inode=27055 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 09:52:22.558000 audit: PATH item=7 name=(null) inode=27058 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 09:52:22.558000 audit: PATH item=8 name=(null) inode=27058 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 09:52:22.558000 audit: PATH item=9 name=(null) inode=27059 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 09:52:22.558000 audit: PATH item=10 name=(null) inode=27058 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 09:52:22.558000 audit: PATH item=11 name=(null) inode=27060 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 09:52:22.558000 audit: PATH item=12 name=(null) inode=27058 dev=00:0b 
mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 09:52:22.558000 audit: PATH item=13 name=(null) inode=27061 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 09:52:22.558000 audit: PATH item=14 name=(null) inode=27058 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 09:52:22.558000 audit: PATH item=15 name=(null) inode=27062 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 09:52:22.558000 audit: PATH item=16 name=(null) inode=27058 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 09:52:22.558000 audit: PATH item=17 name=(null) inode=27063 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 09:52:22.558000 audit: PATH item=18 name=(null) inode=27055 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 09:52:22.558000 audit: PATH item=19 name=(null) inode=27064 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 09:52:22.558000 audit: PATH item=20 name=(null) inode=27064 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 09:52:22.558000 audit: PATH item=21 name=(null) inode=27065 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 09:52:22.558000 audit: PATH item=22 name=(null) inode=27064 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 09:52:22.558000 audit: PATH item=23 name=(null) inode=27066 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 09:52:22.558000 audit: PATH item=24 name=(null) inode=27064 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 09:52:22.558000 audit: PATH item=25 name=(null) inode=27067 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 09:52:22.558000 audit: PATH item=26 name=(null) inode=27064 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 09:52:22.558000 audit: PATH item=27 name=(null) inode=27068 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 09:52:22.558000 audit: PATH item=28 name=(null) inode=27064 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 
nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 09:52:22.558000 audit: PATH item=29 name=(null) inode=27069 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 09:52:22.558000 audit: PATH item=30 name=(null) inode=27055 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 09:52:22.558000 audit: PATH item=31 name=(null) inode=27070 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 09:52:22.558000 audit: PATH item=32 name=(null) inode=27070 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 09:52:22.558000 audit: PATH item=33 name=(null) inode=27071 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 09:52:22.558000 audit: PATH item=34 name=(null) inode=27070 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 09:52:22.558000 audit: PATH item=35 name=(null) inode=27072 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 09:52:22.558000 audit: PATH item=36 name=(null) inode=27070 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 09:52:22.558000 audit: PATH item=37 name=(null) inode=27073 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 09:52:22.558000 audit: PATH item=38 name=(null) inode=27070 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 09:52:22.558000 audit: PATH item=39 name=(null) inode=27074 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 09:52:22.558000 audit: PATH item=40 name=(null) inode=27070 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 09:52:22.558000 audit: PATH item=41 name=(null) inode=27075 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 09:52:22.558000 audit: PROCTITLE proctitle="(udev-worker)" Feb 13 09:52:22.795500 kernel: ipmi device interface Feb 13 09:52:22.795541 kernel: i801_smbus 0000:00:1f.4: SPD Write Disable is set Feb 13 09:52:22.837849 kernel: i801_smbus 0000:00:1f.4: SMBus using PCI interrupt Feb 13 09:52:22.860562 kernel: i2c i2c-0: 1/4 memory slots populated (from DMI) Feb 13 09:52:22.905346 kernel: ipmi_si: IPMI System Interface driver Feb 13 09:52:22.905424 kernel: ipmi_si dmi-ipmi-si.0: ipmi_platform: probing via SMBIOS Feb 13 09:52:22.905537 kernel: ipmi_platform: ipmi_si: SMBIOS: io 0xca2 regsize 1 spacing 1 irq 0 Feb 13 09:52:22.921780 
systemd-networkd[1325]: bond0: netdev ready Feb 13 09:52:22.923765 systemd-networkd[1325]: lo: Link UP Feb 13 09:52:22.923767 systemd-networkd[1325]: lo: Gained carrier Feb 13 09:52:22.924221 systemd-networkd[1325]: Enumeration completed Feb 13 09:52:22.924325 systemd[1]: Started systemd-networkd.service. Feb 13 09:52:22.924516 systemd-networkd[1325]: bond0: Configuring with /etc/systemd/network/05-bond0.network. Feb 13 09:52:22.925290 systemd-networkd[1325]: enp1s0f1np1: Configuring with /etc/systemd/network/10-b8:59:9f:e1:0a:f9.network. Feb 13 09:52:22.942000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:22.951290 kernel: ipmi_si: Adding SMBIOS-specified kcs state machine Feb 13 09:52:22.951456 kernel: ipmi_si IPI0001:00: ipmi_platform: probing via ACPI Feb 13 09:52:22.974482 kernel: ipmi_si IPI0001:00: ipmi_platform: [io 0x0ca2] regsize 1 spacing 1 irq 0 Feb 13 09:52:22.974738 kernel: iTCO_vendor_support: vendor-support=0 Feb 13 09:52:23.072699 kernel: ipmi_si dmi-ipmi-si.0: Removing SMBIOS-specified kcs state machine in favor of ACPI Feb 13 09:52:23.072848 kernel: ipmi_si: Adding ACPI-specified kcs state machine Feb 13 09:52:23.073471 kernel: ipmi_si: Trying ACPI-specified kcs state machine at i/o address 0xca2, slave address 0x20, irq 0 Feb 13 09:52:23.099504 kernel: mlx5_core 0000:01:00.1 enp1s0f1np1: Link up Feb 13 09:52:23.137457 kernel: iTCO_wdt iTCO_wdt: Found a Intel PCH TCO device (Version=6, TCOBASE=0x0400) Feb 13 09:52:23.137554 kernel: bond0: (slave enp1s0f1np1): Enslaving as a backup interface with an up link Feb 13 09:52:23.137569 kernel: ipmi_si IPI0001:00: The BMC does not support clearing the recv irq bit, compensating, but the BMC needs to be fixed. Feb 13 09:52:23.171488 kernel: bond0: Warning: No 802.3ad response from the link partner for any adapters in the bond Feb 13 09:52:23.171515 kernel: iTCO_wdt iTCO_wdt: initialized. heartbeat=30 sec (nowayout=0) Feb 13 09:52:23.215457 kernel: ipmi_si IPI0001:00: IPMI message handler: Found new BMC (man_id: 0x002a7c, prod_id: 0x1b0f, dev_id: 0x20) Feb 13 09:52:23.216013 systemd-networkd[1325]: enp1s0f0np0: Configuring with /etc/systemd/network/10-b8:59:9f:e1:0a:f8.network. Feb 13 09:52:23.241456 kernel: IPv6: ADDRCONF(NETDEV_CHANGE): bond0: link becomes ready Feb 13 09:52:23.265459 kernel: ipmi_si IPI0001:00: IPMI kcs interface initialized Feb 13 09:52:23.336499 kernel: bond0: Warning: No 802.3ad response from the link partner for any adapters in the bond Feb 13 09:52:23.360457 kernel: intel_rapl_common: Found RAPL domain package Feb 13 09:52:23.399796 kernel: intel_rapl_common: Found RAPL domain core Feb 13 09:52:23.399820 kernel: intel_rapl_common: Found RAPL domain dram Feb 13 09:52:23.446468 kernel: ipmi_ssif: IPMI SSIF Interface driver Feb 13 09:52:24.304502 kernel: mlx5_core 0000:01:00.0 enp1s0f0np0: Link up Feb 13 09:52:24.330522 kernel: bond0: (slave enp1s0f0np0): Enslaving as a backup interface with an up link Feb 13 09:52:24.332028 systemd-networkd[1325]: bond0: Link UP Feb 13 09:52:24.332214 systemd-networkd[1325]: enp1s0f1np1: Link UP Feb 13 09:52:24.332321 systemd-networkd[1325]: enp1s0f1np1: Gained carrier Feb 13 09:52:24.333261 systemd-networkd[1325]: enp1s0f1np1: Reconfiguring with /etc/systemd/network/10-b8:59:9f:e1:0a:f8.network. Feb 13 09:52:24.360842 systemd[1]: Finished systemd-udev-settle.service. 
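systemd-networkd enumerates the links, matches bond0 against /etc/systemd/network/05-bond0.network, and matches each port against a MAC-named .network file. Only the file names appear in the log; a sketch of plausible contents follows, with 802.3ad inferred from the bond's "No 802.3ad response" warnings and everything else (including the .netdev that creates bond0) assumed:

    # Create the bond device (assumed; the log names only the .network files)
    cat >/etc/systemd/network/05-bond0.netdev <<'EOF'
    [NetDev]
    Name=bond0
    Kind=bond

    [Bond]
    Mode=802.3ad
    EOF

    # Enslave one port to bond0, matched by its MAC address
    cat >/etc/systemd/network/10-b8:59:9f:e1:0a:f8.network <<'EOF'
    [Match]
    MACAddress=b8:59:9f:e1:0a:f8

    [Network]
    Bond=bond0
    EOF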
Feb 13 09:52:24.373516 kernel: bond0: (slave enp1s0f1np1): link status definitely up, 25000 Mbps full duplex Feb 13 09:52:24.373565 kernel: bond0: active interface up! Feb 13 09:52:24.380000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-settle comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:24.382232 systemd[1]: Starting lvm2-activation-early.service... Feb 13 09:52:24.398132 lvm[1378]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Feb 13 09:52:24.414456 kernel: bond0: (slave enp1s0f0np0): link status definitely up, 25000 Mbps full duplex Feb 13 09:52:24.432865 systemd[1]: Finished lvm2-activation-early.service. Feb 13 09:52:24.439000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=lvm2-activation-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:24.440580 systemd[1]: Reached target cryptsetup.target. Feb 13 09:52:24.449095 systemd[1]: Starting lvm2-activation.service... Feb 13 09:52:24.451213 lvm[1379]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Feb 13 09:52:24.480881 systemd[1]: Finished lvm2-activation.service. Feb 13 09:52:24.488000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=lvm2-activation comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:24.489579 systemd[1]: Reached target local-fs-pre.target. Feb 13 09:52:24.497568 systemd[1]: var-lib-machines.mount was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Feb 13 09:52:24.497581 systemd[1]: Reached target local-fs.target. Feb 13 09:52:24.505560 systemd[1]: Reached target machines.target. Feb 13 09:52:24.514116 systemd[1]: Starting ldconfig.service... Feb 13 09:52:24.520888 systemd[1]: systemd-binfmt.service was skipped because no trigger condition checks were met. Feb 13 09:52:24.520943 systemd[1]: systemd-boot-system-token.service was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/LoaderFeatures-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f). Feb 13 09:52:24.521555 systemd[1]: Starting systemd-boot-update.service... Feb 13 09:52:24.543503 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 09:52:24.559171 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service... Feb 13 09:52:24.567512 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 09:52:24.586065 systemd[1]: Starting systemd-machine-id-commit.service... Feb 13 09:52:24.591124 systemd[1]: systemd-sysext.service was skipped because no trigger condition checks were met. Feb 13 09:52:24.591144 systemd[1]: ensure-sysext.service was skipped because no trigger condition checks were met. Feb 13 09:52:24.591493 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 09:52:24.591626 systemd[1]: Starting systemd-tmpfiles-setup.service... Feb 13 09:52:24.591838 systemd[1]: boot.automount: Got automount request for /boot, triggered by 1381 (bootctl) Feb 13 09:52:24.592444 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-EFI\x2dSYSTEM.service... 
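Both lvm2-activation passes warn that lvmetad is unreachable and fall back to a direct device scan, which is benign on a host carrying no LVM metadata. What the generated activation units run is roughly the following (a sketch; exact flags vary between lvm2 builds):

    # Auto-activate any complete volume groups without the lvmetad cache
    lvm vgchange -aay --sysinit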
Feb 13 09:52:24.614458 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 09:52:24.620043 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service. Feb 13 09:52:24.635000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck@dev-disk-by\x2dlabel-OEM comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:24.637456 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 09:52:24.659497 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 09:52:24.680500 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 09:52:24.701455 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 09:52:24.722456 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 09:52:24.743455 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 09:52:24.747068 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 09:52:24.748664 systemd-tmpfiles[1385]: /usr/lib/tmpfiles.d/legacy.conf:13: Duplicate line for path "/run/lock", ignoring. Feb 13 09:52:24.782492 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 09:52:24.802495 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 09:52:24.822480 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 09:52:24.842498 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 09:52:24.862481 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 09:52:24.881480 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 09:52:24.900457 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 09:52:24.919512 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 09:52:24.938558 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 09:52:24.957501 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 09:52:24.976503 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 09:52:24.993481 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 09:52:25.011481 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 09:52:25.029493 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 09:52:25.047485 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 09:52:25.065500 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 09:52:25.082480 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 09:52:25.099483 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 09:52:25.116631 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 
200 ms Feb 13 09:52:25.125272 systemd-tmpfiles[1385]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Feb 13 09:52:25.131517 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 09:52:25.139608 systemd-tmpfiles[1385]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Feb 13 09:52:25.147514 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 09:52:25.163409 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Feb 13 09:52:25.163546 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 09:52:25.163763 systemd[1]: Finished systemd-machine-id-commit.service. Feb 13 09:52:25.162000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-machine-id-commit comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:25.178262 systemd-fsck[1389]: fsck.fat 4.2 (2021-01-31) Feb 13 09:52:25.178262 systemd-fsck[1389]: /dev/sdb1: 789 files, 115339/258078 clusters Feb 13 09:52:25.178457 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 09:52:25.189591 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-EFI\x2dSYSTEM.service. Feb 13 09:52:25.193500 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 09:52:25.208506 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 09:52:25.216000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck@dev-disk-by\x2dlabel-EFI\x2dSYSTEM comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:25.219125 systemd[1]: Mounting boot.mount... Feb 13 09:52:25.222493 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 09:52:25.236502 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 09:52:25.237421 systemd[1]: Mounted boot.mount. Feb 13 09:52:25.250500 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 09:52:25.264494 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 09:52:25.271975 systemd[1]: Finished systemd-boot-update.service. Feb 13 09:52:25.277501 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 09:52:25.291507 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 09:52:25.299000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-boot-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:25.302030 systemd[1]: Finished systemd-tmpfiles-setup.service. Feb 13 09:52:25.303486 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 09:52:25.315484 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 09:52:25.319640 ldconfig[1380]: /sbin/ldconfig: /lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. 
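systemd-machine-id-commit persists the machine ID that first boot provisioned on a transient bind mount (note the etc-machine\x2did.mount deactivation just before it), and systemd-fsck reports the EFI partition clean via fsck.fat 4.2. The manual equivalents, using only commands the log itself names:

    # Write the transient /etc/machine-id to disk and drop the bind mount
    systemd-machine-id-setup --commit
    # Re-run the FAT check read-only against the same partition
    fsck.fat -n /dev/sdb1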
Feb 13 09:52:25.325000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:25.327634 systemd[1]: Finished ldconfig.service. Feb 13 09:52:25.329487 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 09:52:25.342508 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 09:52:25.350000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=ldconfig comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:52:25.353506 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 09:52:25.353682 systemd[1]: Starting audit-rules.service... Feb 13 09:52:25.365518 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 09:52:25.368000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=add_rule key=(null) list=5 res=1 Feb 13 09:52:25.368000 audit[1408]: SYSCALL arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7ffc467170d0 a2=420 a3=0 items=0 ppid=1393 pid=1408 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/sbin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:52:25.368000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Feb 13 09:52:25.370578 augenrules[1408]: No rules Feb 13 09:52:25.376381 systemd[1]: Starting clean-ca-certificates.service... Feb 13 09:52:25.377458 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 09:52:25.390514 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 09:52:25.400648 systemd[1]: Starting systemd-journal-catalog-update.service... Feb 13 09:52:25.401494 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 09:52:25.413507 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 09:52:25.426458 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 09:52:25.426708 systemd[1]: Starting systemd-resolved.service... Feb 13 09:52:25.439496 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 09:52:25.451457 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 09:52:25.451698 systemd[1]: Starting systemd-timesyncd.service... Feb 13 09:52:25.464492 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 09:52:25.475644 systemd[1]: Starting systemd-update-utmp.service... Feb 13 09:52:25.476490 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 09:52:25.488513 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 09:52:25.499418 systemd[1]: Finished audit-rules.service. 
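The audit CONFIG_CHANGE/SYSCALL pair above records auditctl loading an empty ruleset (augenrules reports "No rules"), and the PROCTITLE field is hex-encoded with NUL separators between argv entries. Decoding it recovers the exact command line; ausearch is the usual way to render such records readably, assuming the audit userspace tools are installed:

    # Decode the PROCTITLE hex from the event above
    echo 2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 \
      | xxd -r -p | tr '\0' ' '; echo
    # -> /sbin/auditctl -R /etc/audit/audit.rules

    # Render recent SYSCALL/PROCTITLE records with numeric fields interpreted
    ausearch -i -m SYSCALL,PROCTITLE --start recent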
Feb 13 09:52:25.500513 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 09:52:25.513524 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 09:52:25.524230 systemd[1]: Finished clean-ca-certificates.service. Feb 13 09:52:25.525494 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 09:52:25.526417 systemd-networkd[1325]: bond0: Gained carrier Feb 13 09:52:25.526776 systemd-networkd[1325]: enp1s0f0np0: Link UP Feb 13 09:52:25.527122 systemd-networkd[1325]: enp1s0f0np0: Gained carrier Feb 13 09:52:25.544412 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 09:52:25.544480 kernel: bond0: (slave enp1s0f1np1): invalid new link 1 on slave Feb 13 09:52:25.548659 systemd[1]: Finished systemd-journal-catalog-update.service. Feb 13 09:52:25.553676 systemd-networkd[1325]: enp1s0f1np1: Link DOWN Feb 13 09:52:25.553679 systemd-networkd[1325]: enp1s0f1np1: Lost carrier Feb 13 09:52:25.560333 systemd[1]: Starting systemd-update-done.service... Feb 13 09:52:25.563656 systemd-networkd[1325]: bond0: Gained IPv6LL Feb 13 09:52:25.567550 systemd[1]: update-ca-certificates.service was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Feb 13 09:52:25.567782 systemd[1]: Finished systemd-update-utmp.service. Feb 13 09:52:25.575674 systemd[1]: Finished systemd-update-done.service. Feb 13 09:52:25.585354 systemd[1]: Started systemd-timesyncd.service. Feb 13 09:52:25.587940 systemd-resolved[1415]: Positive Trust Anchors: Feb 13 09:52:25.587946 systemd-resolved[1415]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Feb 13 09:52:25.587964 systemd-resolved[1415]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa corp home internal intranet lan local private test Feb 13 09:52:25.591862 systemd-resolved[1415]: Using system hostname 'ci-3510.3.2-a-e401d5bc82'. Feb 13 09:52:25.593617 systemd[1]: Reached target time-set.target. Feb 13 09:52:25.770503 kernel: mlx5_core 0000:01:00.1 enp1s0f1np1: Link up Feb 13 09:52:25.770653 kernel: bond0: (slave enp1s0f1np1): link status up again after 200 ms Feb 13 09:52:25.779494 kernel: bond0: (slave enp1s0f1np1): speed changed to 0 on port 1 Feb 13 09:52:25.787460 kernel: bond0: (slave enp1s0f1np1): link status up again after 200 ms Feb 13 09:52:25.789739 systemd-networkd[1325]: enp1s0f1np1: Link UP Feb 13 09:52:25.790264 systemd-networkd[1325]: enp1s0f1np1: Gained carrier Feb 13 09:52:25.790867 systemd[1]: Started systemd-resolved.service. Feb 13 09:52:25.808509 kernel: bond0: (slave enp1s0f1np1): link status definitely up, 25000 Mbps full duplex Feb 13 09:52:25.812590 systemd[1]: Reached target network.target. Feb 13 09:52:25.820563 systemd[1]: Reached target nss-lookup.target. Feb 13 09:52:25.828551 systemd[1]: Reached target sysinit.target. Feb 13 09:52:25.836592 systemd[1]: Started motdgen.path. Feb 13 09:52:25.843561 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path. Feb 13 09:52:25.853615 systemd[1]: Started logrotate.timer. 
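systemd-resolved seeds DNSSEC validation with the root trust anchor (the ". IN DS 20326 8 2 ..." record above) plus negative anchors for private zones, and timesyncd is now running. Once the system is up, both are inspectable with the stock systemd CLIs, nothing assumed beyond those tools:

    # Per-link DNS servers, DNSSEC mode, and active trust anchors
    resolvectl status
    # NTP peer and offset; the server contacted is logged later (0.flatcar.pool.ntp.org)
    timedatectl timesync-status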
Feb 13 09:52:25.860571 systemd[1]: Started mdadm.timer. Feb 13 09:52:25.867545 systemd[1]: Started systemd-tmpfiles-clean.timer. Feb 13 09:52:25.875522 systemd[1]: update-engine-stub.timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Feb 13 09:52:25.875554 systemd[1]: Reached target paths.target. Feb 13 09:52:25.882543 systemd[1]: Reached target timers.target. Feb 13 09:52:25.889669 systemd[1]: Listening on dbus.socket. Feb 13 09:52:25.897111 systemd[1]: Starting docker.socket... Feb 13 09:52:25.905029 systemd[1]: Listening on sshd.socket. Feb 13 09:52:25.911625 systemd[1]: systemd-pcrphase-sysinit.service was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f). Feb 13 09:52:25.911839 systemd[1]: Listening on docker.socket. Feb 13 09:52:25.918586 systemd[1]: Reached target sockets.target. Feb 13 09:52:25.926533 systemd[1]: Reached target basic.target. Feb 13 09:52:25.933545 systemd[1]: addon-config@usr-share-oem.service was skipped because no trigger condition checks were met. Feb 13 09:52:25.933558 systemd[1]: addon-run@usr-share-oem.service was skipped because no trigger condition checks were met. Feb 13 09:52:25.933989 systemd[1]: Starting containerd.service... Feb 13 09:52:25.940946 systemd[1]: Starting coreos-metadata-sshkeys@core.service... Feb 13 09:52:25.949994 systemd[1]: Starting coreos-metadata.service... Feb 13 09:52:25.957024 systemd[1]: Starting dbus.service... Feb 13 09:52:25.963222 systemd[1]: Starting enable-oem-cloudinit.service... Feb 13 09:52:25.968001 jq[1431]: false Feb 13 09:52:25.970077 coreos-metadata[1424]: Feb 13 09:52:25.970 INFO Fetching https://metadata.packet.net/metadata: Attempt #1 Feb 13 09:52:25.971202 systemd[1]: Starting extend-filesystems.service... Feb 13 09:52:25.976175 dbus-daemon[1430]: [system] SELinux support is enabled Feb 13 09:52:25.977575 systemd[1]: flatcar-setup-environment.service was skipped because of an unmet condition check (ConditionPathExists=/usr/share/oem/bin/flatcar-setup-environment). Feb 13 09:52:25.978203 systemd[1]: Starting motdgen.service... Feb 13 09:52:25.978777 extend-filesystems[1433]: Found sda Feb 13 09:52:26.002539 kernel: EXT4-fs (sdb9): resizing filesystem from 553472 to 116605649 blocks Feb 13 09:52:25.986232 systemd[1]: Starting prepare-cni-plugins.service... Feb 13 09:52:26.002622 coreos-metadata[1427]: Feb 13 09:52:25.979 INFO Fetching https://metadata.packet.net/metadata: Attempt #1 Feb 13 09:52:26.002727 extend-filesystems[1433]: Found sdb Feb 13 09:52:26.002727 extend-filesystems[1433]: Found sdb1 Feb 13 09:52:26.002727 extend-filesystems[1433]: Found sdb2 Feb 13 09:52:26.002727 extend-filesystems[1433]: Found sdb3 Feb 13 09:52:26.002727 extend-filesystems[1433]: Found usr Feb 13 09:52:26.002727 extend-filesystems[1433]: Found sdb4 Feb 13 09:52:26.002727 extend-filesystems[1433]: Found sdb6 Feb 13 09:52:26.002727 extend-filesystems[1433]: Found sdb7 Feb 13 09:52:26.002727 extend-filesystems[1433]: Found sdb9 Feb 13 09:52:26.002727 extend-filesystems[1433]: Checking size of /dev/sdb9 Feb 13 09:52:26.002727 extend-filesystems[1433]: Resized partition /dev/sdb9 Feb 13 09:52:26.110660 extend-filesystems[1448]: resize2fs 1.46.5 (30-Dec-2021) Feb 13 09:52:26.010241 systemd[1]: Starting prepare-critools.service... Feb 13 09:52:26.017099 systemd[1]: Starting prepare-helm.service... Feb 13 09:52:26.039945 systemd[1]: Starting ssh-key-proc-cmdline.service... 
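extend-filesystems walks the disks, selects /dev/sdb9 as the root partition to grow, and the kernel confirms an online resize from 553472 to 116605649 blocks is underway (it completes later in the log). The underlying operation is a single e2fsprogs call (resize2fs 1.46.5 per the log), safe on a mounted ext4 filesystem:

    # Grow the mounted ext4 root to fill its enlarged partition
    resize2fs /dev/sdb9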
Feb 13 09:52:26.047024 systemd[1]: Starting sshd-keygen.service... Feb 13 09:52:26.066773 systemd[1]: Starting systemd-logind.service... Feb 13 09:52:26.083486 systemd[1]: systemd-pcrphase.service was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f). Feb 13 09:52:26.083981 systemd[1]: Starting tcsd.service... Feb 13 09:52:26.132046 jq[1464]: true Feb 13 09:52:26.087473 systemd-logind[1461]: Watching system buttons on /dev/input/event3 (Power Button) Feb 13 09:52:26.087482 systemd-logind[1461]: Watching system buttons on /dev/input/event2 (Sleep Button) Feb 13 09:52:26.087491 systemd-logind[1461]: Watching system buttons on /dev/input/event0 (HID 0557:2419) Feb 13 09:52:26.087642 systemd-logind[1461]: New seat seat0. Feb 13 09:52:26.102885 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Feb 13 09:52:26.103243 systemd[1]: Starting update-engine.service... Feb 13 09:52:26.124030 systemd[1]: Starting update-ssh-keys-after-ignition.service... Feb 13 09:52:26.139805 systemd[1]: Started dbus.service. Feb 13 09:52:26.146611 update_engine[1463]: I0213 09:52:26.146179 1463 main.cc:92] Flatcar Update Engine starting Feb 13 09:52:26.148186 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Feb 13 09:52:26.148273 systemd[1]: Condition check resulted in enable-oem-cloudinit.service being skipped. Feb 13 09:52:26.148437 systemd[1]: motdgen.service: Deactivated successfully. Feb 13 09:52:26.148523 systemd[1]: Finished motdgen.service. Feb 13 09:52:26.149473 update_engine[1463]: I0213 09:52:26.149462 1463 update_check_scheduler.cc:74] Next update check in 2m10s Feb 13 09:52:26.156587 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Feb 13 09:52:26.156670 systemd[1]: Finished ssh-key-proc-cmdline.service. Feb 13 09:52:26.160886 tar[1466]: ./ Feb 13 09:52:26.160886 tar[1466]: ./macvlan Feb 13 09:52:26.167095 jq[1472]: true Feb 13 09:52:26.167847 dbus-daemon[1430]: [system] Successfully activated service 'org.freedesktop.systemd1' Feb 13 09:52:26.168925 tar[1467]: crictl Feb 13 09:52:26.170249 tar[1468]: linux-amd64/helm Feb 13 09:52:26.173607 systemd[1]: tcsd.service: Skipped due to 'exec-condition'. Feb 13 09:52:26.173710 systemd[1]: Condition check resulted in tcsd.service being skipped. Feb 13 09:52:26.175308 systemd[1]: Started update-engine.service. Feb 13 09:52:26.177605 env[1473]: time="2024-02-13T09:52:26.177580644Z" level=info msg="starting containerd" revision=92b3a9d6f1b3bcc6dc74875cfdea653fe39f09c2 version=1.6.16 Feb 13 09:52:26.182893 tar[1466]: ./static Feb 13 09:52:26.186103 env[1473]: time="2024-02-13T09:52:26.186083075Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1 Feb 13 09:52:26.186578 systemd[1]: Started systemd-logind.service. Feb 13 09:52:26.186988 env[1473]: time="2024-02-13T09:52:26.186977416Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1 Feb 13 09:52:26.187640 env[1473]: time="2024-02-13T09:52:26.187624342Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." 
error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/5.15.148-flatcar\\n\"): skip plugin" type=io.containerd.snapshotter.v1 Feb 13 09:52:26.187670 env[1473]: time="2024-02-13T09:52:26.187639585Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1 Feb 13 09:52:26.187800 env[1473]: time="2024-02-13T09:52:26.187756630Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Feb 13 09:52:26.187800 env[1473]: time="2024-02-13T09:52:26.187767241Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1 Feb 13 09:52:26.187800 env[1473]: time="2024-02-13T09:52:26.187774348Z" level=warning msg="failed to load plugin io.containerd.snapshotter.v1.devmapper" error="devmapper not configured" Feb 13 09:52:26.187800 env[1473]: time="2024-02-13T09:52:26.187779777Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1 Feb 13 09:52:26.189701 env[1473]: time="2024-02-13T09:52:26.189672367Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1 Feb 13 09:52:26.189838 env[1473]: time="2024-02-13T09:52:26.189798990Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1 Feb 13 09:52:26.189926 env[1473]: time="2024-02-13T09:52:26.189872292Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Feb 13 09:52:26.189926 env[1473]: time="2024-02-13T09:52:26.189883322Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1 Feb 13 09:52:26.191945 env[1473]: time="2024-02-13T09:52:26.191904347Z" level=warning msg="could not use snapshotter devmapper in metadata plugin" error="devmapper not configured" Feb 13 09:52:26.191945 env[1473]: time="2024-02-13T09:52:26.191916330Z" level=info msg="metadata content store policy set" policy=shared Feb 13 09:52:26.196681 systemd[1]: Started locksmithd.service. Feb 13 09:52:26.201621 bash[1501]: Updated "/home/core/.ssh/authorized_keys" Feb 13 09:52:26.203642 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Feb 13 09:52:26.203777 systemd[1]: Reached target system-config.target. Feb 13 09:52:26.206102 env[1473]: time="2024-02-13T09:52:26.206085859Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1 Feb 13 09:52:26.206137 env[1473]: time="2024-02-13T09:52:26.206107680Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1 Feb 13 09:52:26.206137 env[1473]: time="2024-02-13T09:52:26.206116725Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1 Feb 13 09:52:26.206182 env[1473]: time="2024-02-13T09:52:26.206137916Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." 
type=io.containerd.service.v1 Feb 13 09:52:26.206182 env[1473]: time="2024-02-13T09:52:26.206147290Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1 Feb 13 09:52:26.206182 env[1473]: time="2024-02-13T09:52:26.206156793Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1 Feb 13 09:52:26.206182 env[1473]: time="2024-02-13T09:52:26.206164060Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1 Feb 13 09:52:26.206182 env[1473]: time="2024-02-13T09:52:26.206172264Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1 Feb 13 09:52:26.206182 env[1473]: time="2024-02-13T09:52:26.206179684Z" level=info msg="loading plugin \"io.containerd.service.v1.leases-service\"..." type=io.containerd.service.v1 Feb 13 09:52:26.206269 env[1473]: time="2024-02-13T09:52:26.206187364Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1 Feb 13 09:52:26.206269 env[1473]: time="2024-02-13T09:52:26.206194447Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1 Feb 13 09:52:26.206269 env[1473]: time="2024-02-13T09:52:26.206201479Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1 Feb 13 09:52:26.206269 env[1473]: time="2024-02-13T09:52:26.206255538Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2 Feb 13 09:52:26.206329 env[1473]: time="2024-02-13T09:52:26.206303874Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1 Feb 13 09:52:26.206451 env[1473]: time="2024-02-13T09:52:26.206442112Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1 Feb 13 09:52:26.206476 env[1473]: time="2024-02-13T09:52:26.206464926Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1 Feb 13 09:52:26.206476 env[1473]: time="2024-02-13T09:52:26.206473752Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1 Feb 13 09:52:26.206513 env[1473]: time="2024-02-13T09:52:26.206501543Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1 Feb 13 09:52:26.206513 env[1473]: time="2024-02-13T09:52:26.206509055Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1 Feb 13 09:52:26.206547 env[1473]: time="2024-02-13T09:52:26.206515766Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1 Feb 13 09:52:26.206547 env[1473]: time="2024-02-13T09:52:26.206522455Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1 Feb 13 09:52:26.206547 env[1473]: time="2024-02-13T09:52:26.206528819Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1 Feb 13 09:52:26.206547 env[1473]: time="2024-02-13T09:52:26.206535457Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1 Feb 13 09:52:26.206547 env[1473]: time="2024-02-13T09:52:26.206541441Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." 
type=io.containerd.grpc.v1 Feb 13 09:52:26.206637 env[1473]: time="2024-02-13T09:52:26.206547719Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1 Feb 13 09:52:26.206637 env[1473]: time="2024-02-13T09:52:26.206555558Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1 Feb 13 09:52:26.206637 env[1473]: time="2024-02-13T09:52:26.206620591Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1 Feb 13 09:52:26.206637 env[1473]: time="2024-02-13T09:52:26.206629681Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1 Feb 13 09:52:26.206699 env[1473]: time="2024-02-13T09:52:26.206640168Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1 Feb 13 09:52:26.206699 env[1473]: time="2024-02-13T09:52:26.206649336Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1 Feb 13 09:52:26.206699 env[1473]: time="2024-02-13T09:52:26.206663585Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="no OpenTelemetry endpoint: skip plugin" type=io.containerd.tracing.processor.v1 Feb 13 09:52:26.206699 env[1473]: time="2024-02-13T09:52:26.206675540Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1 Feb 13 09:52:26.206699 env[1473]: time="2024-02-13T09:52:26.206689017Z" level=error msg="failed to initialize a tracing processor \"otlp\"" error="no OpenTelemetry endpoint: skip plugin" Feb 13 09:52:26.206776 env[1473]: time="2024-02-13T09:52:26.206711441Z" level=info msg="loading plugin \"io.containerd.grpc.v1.cri\"..." 
type=io.containerd.grpc.v1 Feb 13 09:52:26.206848 env[1473]: time="2024-02-13T09:52:26.206821996Z" level=info msg="Start cri plugin with config {PluginConfig:{ContainerdConfig:{Snapshotter:overlayfs DefaultRuntimeName:runc DefaultRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0} UntrustedWorkloadRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0} Runtimes:map[runc:{Type:io.containerd.runc.v2 Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[SystemdCgroup:true] PrivilegedWithoutHostDevices:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0}] NoPivot:false DisableSnapshotAnnotations:true DiscardUnpackedLayers:false IgnoreRdtNotEnabledErrors:false} CniConfig:{NetworkPluginBinDir:/opt/cni/bin NetworkPluginConfDir:/etc/cni/net.d NetworkPluginMaxConfNum:1 NetworkPluginConfTemplate: IPPreference:} Registry:{ConfigPath: Mirrors:map[] Configs:map[] Auths:map[] Headers:map[]} ImageDecryption:{KeyModel:node} DisableTCPService:true StreamServerAddress:127.0.0.1 StreamServerPort:0 StreamIdleTimeout:4h0m0s EnableSelinux:true SelinuxCategoryRange:1024 SandboxImage:registry.k8s.io/pause:3.6 StatsCollectPeriod:10 SystemdCgroup:false EnableTLSStreaming:false X509KeyPairStreaming:{TLSCertFile: TLSKeyFile:} MaxContainerLogLineSize:16384 DisableCgroup:false DisableApparmor:false RestrictOOMScoreAdj:false MaxConcurrentDownloads:3 DisableProcMount:false UnsetSeccompProfile: TolerateMissingHugetlbController:true DisableHugetlbController:true DeviceOwnershipFromSecurityContext:false IgnoreImageDefinedVolumes:false NetNSMountsUnderStateDir:false EnableUnprivilegedPorts:false EnableUnprivilegedICMP:false} ContainerdRootDir:/var/lib/containerd ContainerdEndpoint:/run/containerd/containerd.sock RootDir:/var/lib/containerd/io.containerd.grpc.v1.cri StateDir:/run/containerd/io.containerd.grpc.v1.cri}" Feb 13 09:52:26.208672 env[1473]: time="2024-02-13T09:52:26.206856594Z" level=info msg="Connect containerd service" Feb 13 09:52:26.208672 env[1473]: time="2024-02-13T09:52:26.206875344Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\"" Feb 13 09:52:26.208672 env[1473]: time="2024-02-13T09:52:26.207158472Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Feb 13 09:52:26.208672 env[1473]: time="2024-02-13T09:52:26.207259640Z" level=info msg="Start subscribing containerd event" Feb 13 09:52:26.208672 env[1473]: time="2024-02-13T09:52:26.207300578Z" level=info msg="Start recovering state" Feb 13 09:52:26.208672 env[1473]: time="2024-02-13T09:52:26.207316033Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Feb 13 09:52:26.208672 env[1473]: time="2024-02-13T09:52:26.207338512Z" level=info msg="Start event monitor" Feb 13 09:52:26.208672 env[1473]: time="2024-02-13T09:52:26.207338892Z" level=info msg=serving... 
address=/run/containerd/containerd.sock Feb 13 09:52:26.208672 env[1473]: time="2024-02-13T09:52:26.207347357Z" level=info msg="Start snapshots syncer" Feb 13 09:52:26.208672 env[1473]: time="2024-02-13T09:52:26.207352419Z" level=info msg="Start cni network conf syncer for default" Feb 13 09:52:26.208672 env[1473]: time="2024-02-13T09:52:26.207356390Z" level=info msg="Start streaming server" Feb 13 09:52:26.208672 env[1473]: time="2024-02-13T09:52:26.207373283Z" level=info msg="containerd successfully booted in 0.030144s" Feb 13 09:52:26.211581 systemd[1]: user-cloudinit-proc-cmdline.service was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Feb 13 09:52:26.211690 systemd[1]: Reached target user-config.target. Feb 13 09:52:26.214062 tar[1466]: ./vlan Feb 13 09:52:26.221037 systemd[1]: Started containerd.service. Feb 13 09:52:26.227795 systemd[1]: Finished update-ssh-keys-after-ignition.service. Feb 13 09:52:26.236479 tar[1466]: ./portmap Feb 13 09:52:26.256772 tar[1466]: ./host-local Feb 13 09:52:26.261606 locksmithd[1507]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Feb 13 09:52:26.273927 tar[1466]: ./vrf Feb 13 09:52:26.292152 tar[1466]: ./bridge Feb 13 09:52:26.313963 tar[1466]: ./tuning Feb 13 09:52:26.331345 tar[1466]: ./firewall Feb 13 09:52:26.353872 tar[1466]: ./host-device Feb 13 09:52:26.373550 tar[1466]: ./sbr Feb 13 09:52:26.391531 tar[1466]: ./loopback Feb 13 09:52:26.401799 sshd_keygen[1460]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Feb 13 09:52:26.408670 tar[1466]: ./dhcp Feb 13 09:52:26.413509 systemd[1]: Finished sshd-keygen.service. Feb 13 09:52:26.421502 systemd[1]: Starting issuegen.service... Feb 13 09:52:26.429349 systemd[1]: issuegen.service: Deactivated successfully. Feb 13 09:52:26.429437 systemd[1]: Finished issuegen.service. Feb 13 09:52:26.430192 tar[1468]: linux-amd64/LICENSE Feb 13 09:52:26.430192 tar[1468]: linux-amd64/README.md Feb 13 09:52:26.437353 systemd[1]: Finished prepare-critools.service. Feb 13 09:52:26.445845 systemd[1]: Finished prepare-helm.service. Feb 13 09:52:26.454414 systemd[1]: Starting systemd-user-sessions.service... Feb 13 09:52:26.459889 tar[1466]: ./ptp Feb 13 09:52:26.462793 systemd[1]: Finished systemd-user-sessions.service. Feb 13 09:52:26.471363 systemd[1]: Started getty@tty1.service. Feb 13 09:52:26.481605 tar[1466]: ./ipvlan Feb 13 09:52:26.484899 systemd[1]: Started serial-getty@ttyS1.service. Feb 13 09:52:26.485486 kernel: EXT4-fs (sdb9): resized filesystem to 116605649 Feb 13 09:52:26.492683 systemd[1]: Reached target getty.target. Feb 13 09:52:26.512533 extend-filesystems[1448]: Filesystem at /dev/sdb9 is mounted on /; on-line resizing required Feb 13 09:52:26.512533 extend-filesystems[1448]: old_desc_blocks = 1, new_desc_blocks = 56 Feb 13 09:52:26.512533 extend-filesystems[1448]: The filesystem on /dev/sdb9 is now 116605649 (4k) blocks long. Feb 13 09:52:26.549568 extend-filesystems[1433]: Resized filesystem in /dev/sdb9 Feb 13 09:52:26.558583 tar[1466]: ./bandwidth Feb 13 09:52:26.512923 systemd[1]: extend-filesystems.service: Deactivated successfully. Feb 13 09:52:26.513007 systemd[1]: Finished extend-filesystems.service. Feb 13 09:52:26.556076 systemd[1]: Finished prepare-cni-plugins.service. 
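containerd comes up with /etc/cni/net.d empty (hence the earlier "no network config found" error from the CRI plugin) while prepare-cni-plugins has just unpacked the reference plugins (bridge, portmap, host-local, ...) into /opt/cni/bin. A minimal conflist that would satisfy that loader is sketched below; the file name, network name, and subnet are illustrative assumptions, not values from this log:

    cat >/etc/cni/net.d/10-bridge.conflist <<'EOF'
    {
      "cniVersion": "0.4.0",
      "name": "bridge-net",
      "plugins": [
        {
          "type": "bridge",
          "bridge": "cni0",
          "isGateway": true,
          "ipMasq": true,
          "ipam": { "type": "host-local", "subnet": "10.88.0.0/16" }
        },
        { "type": "portmap", "capabilities": { "portMappings": true } }
      ]
    }
    EOF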
Feb 13 09:52:28.973496 kernel: mlx5_core 0000:01:00.0: lag map port 1:1 port 2:2 shared_fdb:0 Feb 13 09:52:31.567499 kernel: mlx5_core 0000:01:00.0: modify lag map port 1:2 port 2:2 Feb 13 09:52:31.574482 kernel: mlx5_core 0000:01:00.0: modify lag map port 1:1 port 2:2 Feb 13 09:52:31.672928 login[1531]: pam_lastlog(login:session): file /var/log/lastlog is locked/write Feb 13 09:52:31.673867 login[1530]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Feb 13 09:52:31.700390 systemd-logind[1461]: New session 1 of user core. Feb 13 09:52:31.702711 systemd[1]: Created slice user-500.slice. Feb 13 09:52:31.705417 systemd[1]: Starting user-runtime-dir@500.service... Feb 13 09:52:31.716117 systemd[1]: Finished user-runtime-dir@500.service. Feb 13 09:52:31.716934 systemd[1]: Starting user@500.service... Feb 13 09:52:31.718790 (systemd)[1538]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Feb 13 09:52:31.761982 coreos-metadata[1427]: Feb 13 09:52:31.761 INFO Failed to fetch: error sending request for url (https://metadata.packet.net/metadata): error trying to connect: dns error: failed to lookup address information: Name or service not known Feb 13 09:52:31.762190 coreos-metadata[1424]: Feb 13 09:52:31.761 INFO Failed to fetch: error sending request for url (https://metadata.packet.net/metadata): error trying to connect: dns error: failed to lookup address information: Name or service not known Feb 13 09:52:31.811913 systemd[1538]: Queued start job for default target default.target. Feb 13 09:52:31.812146 systemd[1538]: Reached target paths.target. Feb 13 09:52:31.812159 systemd[1538]: Reached target sockets.target. Feb 13 09:52:31.812167 systemd[1538]: Reached target timers.target. Feb 13 09:52:31.812174 systemd[1538]: Reached target basic.target. Feb 13 09:52:31.812193 systemd[1538]: Reached target default.target. Feb 13 09:52:31.812207 systemd[1538]: Startup finished in 90ms. Feb 13 09:52:31.812235 systemd[1]: Started user@500.service. Feb 13 09:52:31.812962 systemd[1]: Started session-1.scope. Feb 13 09:54:31.844180 systemd-resolved[1415]: Clock change detected. Flushing caches. Feb 13 09:54:31.844332 systemd-timesyncd[1416]: Contacted time server 23.131.160.7:123 (0.flatcar.pool.ntp.org). Feb 13 09:54:31.844481 systemd-timesyncd[1416]: Initial clock synchronization to Tue 2024-02-13 09:54:31.844024 UTC. Feb 13 09:54:32.576750 login[1531]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Feb 13 09:54:32.579393 systemd-logind[1461]: New session 2 of user core. Feb 13 09:54:32.579943 systemd[1]: Started session-2.scope. Feb 13 09:54:32.660271 coreos-metadata[1427]: Feb 13 09:54:32.660 INFO Fetching https://metadata.packet.net/metadata: Attempt #2 Feb 13 09:54:32.661055 coreos-metadata[1424]: Feb 13 09:54:32.660 INFO Fetching https://metadata.packet.net/metadata: Attempt #2 Feb 13 09:54:32.686996 coreos-metadata[1427]: Feb 13 09:54:32.686 INFO Fetch successful Feb 13 09:54:32.687411 coreos-metadata[1424]: Feb 13 09:54:32.687 INFO Fetch successful Feb 13 09:54:32.711138 systemd[1]: Finished coreos-metadata.service. Feb 13 09:54:32.711979 systemd[1]: Started packet-phone-home.service. 
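Both coreos-metadata fetchers fail their first attempt while DNS resolution is still unavailable, then succeed on attempt #2 once networkd and resolved are up, after which packet-phone-home reports in. The fetch they retry is a plain HTTPS GET of the endpoint named in the log:

    # Repeat the metadata fetch the agents perform (endpoint taken from the log)
    curl -fsS https://metadata.packet.net/metadata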
Feb 13 09:54:32.712102 unknown[1424]: wrote ssh authorized keys file for user: core Feb 13 09:54:32.716959 curl[1560]: % Total % Received % Xferd Average Speed Time Time Time Current Feb 13 09:54:32.717110 curl[1560]: Dload Upload Total Spent Left Speed Feb 13 09:54:32.722301 update-ssh-keys[1561]: Updated "/home/core/.ssh/authorized_keys" Feb 13 09:54:32.722539 systemd[1]: Finished coreos-metadata-sshkeys@core.service. Feb 13 09:54:32.722719 systemd[1]: Reached target multi-user.target. Feb 13 09:54:32.723409 systemd[1]: Starting systemd-update-utmp-runlevel.service... Feb 13 09:54:32.727506 systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully. Feb 13 09:54:32.727596 systemd[1]: Finished systemd-update-utmp-runlevel.service. Feb 13 09:54:32.727762 systemd[1]: Startup finished in 1.902s (kernel) + 18.913s (initrd) + 15.085s (userspace) = 35.902s. Feb 13 09:54:32.906208 curl[1560]: \u000d 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\u000d 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\u000d 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 Feb 13 09:54:32.908665 systemd[1]: packet-phone-home.service: Deactivated successfully. Feb 13 09:54:33.242263 systemd[1]: Created slice system-sshd.slice. Feb 13 09:54:33.242904 systemd[1]: Started sshd@0-139.178.70.43:22-139.178.68.195:33394.service. Feb 13 09:54:33.289488 sshd[1565]: Accepted publickey for core from 139.178.68.195 port 33394 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 09:54:33.290538 sshd[1565]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 09:54:33.294035 systemd-logind[1461]: New session 3 of user core. Feb 13 09:54:33.294893 systemd[1]: Started session-3.scope. Feb 13 09:54:33.351932 systemd[1]: Started sshd@1-139.178.70.43:22-139.178.68.195:33410.service. Feb 13 09:54:33.378515 sshd[1570]: Accepted publickey for core from 139.178.68.195 port 33410 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 09:54:33.379306 sshd[1570]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 09:54:33.381663 systemd-logind[1461]: New session 4 of user core. Feb 13 09:54:33.382114 systemd[1]: Started session-4.scope. Feb 13 09:54:33.443633 sshd[1570]: pam_unix(sshd:session): session closed for user core Feb 13 09:54:33.449772 systemd[1]: sshd@1-139.178.70.43:22-139.178.68.195:33410.service: Deactivated successfully. Feb 13 09:54:33.451396 systemd[1]: session-4.scope: Deactivated successfully. Feb 13 09:54:33.453088 systemd-logind[1461]: Session 4 logged out. Waiting for processes to exit. Feb 13 09:54:33.455621 systemd[1]: Started sshd@2-139.178.70.43:22-139.178.68.195:33414.service. Feb 13 09:54:33.458016 systemd-logind[1461]: Removed session 4. Feb 13 09:54:33.535469 sshd[1576]: Accepted publickey for core from 139.178.68.195 port 33414 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 09:54:33.536744 sshd[1576]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 09:54:33.540285 systemd-logind[1461]: New session 5 of user core. Feb 13 09:54:33.541148 systemd[1]: Started session-5.scope. Feb 13 09:54:33.594714 sshd[1576]: pam_unix(sshd:session): session closed for user core Feb 13 09:54:33.601036 systemd[1]: sshd@2-139.178.70.43:22-139.178.68.195:33414.service: Deactivated successfully. Feb 13 09:54:33.602596 systemd[1]: session-5.scope: Deactivated successfully. Feb 13 09:54:33.604278 systemd-logind[1461]: Session 5 logged out. Waiting for processes to exit. 
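One arithmetic note on the "Startup finished" record above: the printed spans sum to 1.902 s + 18.913 s + 15.085 s = 35.900 s, 2 ms short of the printed 35.902 s total. systemd formats each span and the grand total independently from microsecond-resolution timestamps, so a millisecond-level mismatch between the parts and the whole is expected rounding, not an error.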
Feb 13 09:54:33.606994 systemd[1]: Started sshd@3-139.178.70.43:22-139.178.68.195:33420.service. Feb 13 09:54:33.609416 systemd-logind[1461]: Removed session 5. Feb 13 09:54:33.687541 sshd[1582]: Accepted publickey for core from 139.178.68.195 port 33420 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 09:54:33.688796 sshd[1582]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 09:54:33.692360 systemd-logind[1461]: New session 6 of user core. Feb 13 09:54:33.693233 systemd[1]: Started session-6.scope. Feb 13 09:54:33.758087 sshd[1582]: pam_unix(sshd:session): session closed for user core Feb 13 09:54:33.764310 systemd[1]: sshd@3-139.178.70.43:22-139.178.68.195:33420.service: Deactivated successfully. Feb 13 09:54:33.765994 systemd[1]: session-6.scope: Deactivated successfully. Feb 13 09:54:33.767604 systemd-logind[1461]: Session 6 logged out. Waiting for processes to exit. Feb 13 09:54:33.770224 systemd[1]: Started sshd@4-139.178.70.43:22-139.178.68.195:33424.service. Feb 13 09:54:33.772683 systemd-logind[1461]: Removed session 6. Feb 13 09:54:33.800551 sshd[1588]: Accepted publickey for core from 139.178.68.195 port 33424 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 09:54:33.801174 sshd[1588]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 09:54:33.803689 systemd-logind[1461]: New session 7 of user core. Feb 13 09:54:33.804126 systemd[1]: Started session-7.scope. Feb 13 09:54:33.883144 sudo[1591]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Feb 13 09:54:33.883730 sudo[1591]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=500) Feb 13 09:54:33.900772 dbus-daemon[1430]: \xd0}\xa7\xcc\xd6U: received setenforce notice (enforcing=-274894768) Feb 13 09:54:33.905050 sudo[1591]: pam_unix(sudo:session): session closed for user root Feb 13 09:54:33.909601 sshd[1588]: pam_unix(sshd:session): session closed for user core Feb 13 09:54:33.915859 systemd[1]: sshd@4-139.178.70.43:22-139.178.68.195:33424.service: Deactivated successfully. Feb 13 09:54:33.917096 systemd[1]: session-7.scope: Deactivated successfully. Feb 13 09:54:33.917442 systemd-logind[1461]: Session 7 logged out. Waiting for processes to exit. Feb 13 09:54:33.917917 systemd[1]: Started sshd@5-139.178.70.43:22-139.178.68.195:33428.service. Feb 13 09:54:33.918262 systemd-logind[1461]: Removed session 7. Feb 13 09:54:33.975149 sshd[1595]: Accepted publickey for core from 139.178.68.195 port 33428 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 09:54:33.976468 sshd[1595]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 09:54:33.980664 systemd-logind[1461]: New session 8 of user core. Feb 13 09:54:33.981569 systemd[1]: Started session-8.scope. Feb 13 09:54:34.041152 sudo[1599]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Feb 13 09:54:34.041304 sudo[1599]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=500) Feb 13 09:54:34.042965 sudo[1599]: pam_unix(sudo:session): session closed for user root Feb 13 09:54:34.045322 sudo[1598]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/systemctl restart audit-rules Feb 13 09:54:34.045438 sudo[1598]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=500) Feb 13 09:54:34.050738 systemd[1]: Stopping audit-rules.service... 
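The sudo records above use a fixed layout: the invoking user, then " ; "-separated KEY=value fields ending with COMMAND. A small sketch that splits one apart; the helper name is ours:

```python
# Parse a sudo log line of the form "<user> : PWD=... ; USER=... ; COMMAND=...".
def parse_sudo(line: str) -> tuple[str, dict]:
    user, _, rest = line.partition(" : ")
    fields = dict(part.split("=", 1) for part in rest.split(" ; "))
    return user.strip(), fields

user, f = parse_sudo("core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1")
print(user, f["COMMAND"])  # -> core /usr/sbin/setenforce 1
```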
Feb 13 09:54:34.051000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Feb 13 09:54:34.051578 auditctl[1602]: No rules Feb 13 09:54:34.051774 systemd[1]: audit-rules.service: Deactivated successfully. Feb 13 09:54:34.051864 systemd[1]: Stopped audit-rules.service. Feb 13 09:54:34.052774 systemd[1]: Starting audit-rules.service... Feb 13 09:54:34.056920 kernel: kauditd_printk_skb: 84 callbacks suppressed Feb 13 09:54:34.056955 kernel: audit: type=1305 audit(1707818074.051:171): auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Feb 13 09:54:34.064211 augenrules[1619]: No rules Feb 13 09:54:34.064566 systemd[1]: Finished audit-rules.service. Feb 13 09:54:34.065095 sudo[1598]: pam_unix(sudo:session): session closed for user root Feb 13 09:54:34.066186 sshd[1595]: pam_unix(sshd:session): session closed for user core Feb 13 09:54:34.067931 systemd[1]: sshd@5-139.178.70.43:22-139.178.68.195:33428.service: Deactivated successfully. Feb 13 09:54:34.068309 systemd[1]: session-8.scope: Deactivated successfully. Feb 13 09:54:34.068752 systemd-logind[1461]: Session 8 logged out. Waiting for processes to exit. Feb 13 09:54:34.069388 systemd[1]: Started sshd@6-139.178.70.43:22-139.178.68.195:33436.service. Feb 13 09:54:34.069929 systemd-logind[1461]: Removed session 8. Feb 13 09:54:34.051000 audit[1602]: SYSCALL arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7ffefc8a53e0 a2=420 a3=0 items=0 ppid=1 pid=1602 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/sbin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:34.103518 kernel: audit: type=1300 audit(1707818074.051:171): arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7ffefc8a53e0 a2=420 a3=0 items=0 ppid=1 pid=1602 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/sbin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:34.103545 kernel: audit: type=1327 audit(1707818074.051:171): proctitle=2F7362696E2F617564697463746C002D44 Feb 13 09:54:34.051000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D44 Feb 13 09:54:34.112975 kernel: audit: type=1131 audit(1707818074.051:172): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:54:34.051000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:54:34.133278 sshd[1625]: Accepted publickey for core from 139.178.68.195 port 33436 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 09:54:34.134649 sshd[1625]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 09:54:34.064000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:54:34.136878 systemd-logind[1461]: New session 9 of user core. Feb 13 09:54:34.137244 systemd[1]: Started session-9.scope. 
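The PROCTITLE field in the audit records here is the audited process's argv, hex-encoded with NUL separators. Decoding the value above shows what actually ran, which lines up with the adjacent "auditctl[1602]: No rules" (auditctl -D deletes all loaded rules). The function name is ours:

```python
# Audit PROCTITLE fields are the process argv, hex-encoded with NUL separators.
def decode_proctitle(hexstr: str) -> list[str]:
    return bytes.fromhex(hexstr).decode("utf-8", errors="replace").split("\x00")

print(decode_proctitle("2F7362696E2F617564697463746C002D44"))
# -> ['/sbin/auditctl', '-D']
```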
Feb 13 09:54:34.157866 kernel: audit: type=1130 audit(1707818074.064:173): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:54:34.157891 kernel: audit: type=1106 audit(1707818074.064:174): pid=1598 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_unix,pam_permit,pam_systemd acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Feb 13 09:54:34.064000 audit[1598]: USER_END pid=1598 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_unix,pam_permit,pam_systemd acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Feb 13 09:54:34.183677 sudo[1628]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Feb 13 09:54:34.183783 sudo[1628]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=500) Feb 13 09:54:34.064000 audit[1598]: CRED_DISP pid=1598 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Feb 13 09:54:34.207423 kernel: audit: type=1104 audit(1707818074.064:175): pid=1598 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Feb 13 09:54:34.207451 kernel: audit: type=1106 audit(1707818074.066:176): pid=1595 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 09:54:34.066000 audit[1595]: USER_END pid=1595 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 09:54:34.239635 kernel: audit: type=1104 audit(1707818074.066:177): pid=1595 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 09:54:34.066000 audit[1595]: CRED_DISP pid=1595 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 09:54:34.265609 kernel: audit: type=1131 audit(1707818074.067:178): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@5-139.178.70.43:22-139.178.68.195:33428 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:54:34.067000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@5-139.178.70.43:22-139.178.68.195:33428 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 09:54:34.069000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@6-139.178.70.43:22-139.178.68.195:33436 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:54:34.132000 audit[1625]: USER_ACCT pid=1625 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 09:54:34.134000 audit[1625]: CRED_ACQ pid=1625 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 09:54:34.134000 audit[1625]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc59d3c550 a2=3 a3=0 items=0 ppid=1 pid=1625 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=9 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:34.134000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 09:54:34.138000 audit[1625]: USER_START pid=1625 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 09:54:34.139000 audit[1627]: CRED_ACQ pid=1627 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 09:54:34.183000 audit[1628]: USER_ACCT pid=1628 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Feb 13 09:54:34.183000 audit[1628]: CRED_REFR pid=1628 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Feb 13 09:54:34.184000 audit[1628]: USER_START pid=1628 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_limits,pam_env,pam_unix,pam_permit,pam_systemd acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Feb 13 09:54:38.283503 systemd[1]: Starting systemd-networkd-wait-online.service... Feb 13 09:54:38.287973 systemd[1]: Finished systemd-networkd-wait-online.service. Feb 13 09:54:38.287000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-networkd-wait-online comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:54:38.288234 systemd[1]: Reached target network-online.target. Feb 13 09:54:38.288966 systemd[1]: Starting docker.service... 
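Apart from PROCTITLE, audit records like the USER_START entries above are flat key=value sequences, with the PAM detail quoted inside msg='...'. A sketch for pulling fields out of a single record; shlex keeps the quoted msg payload together as one token, and the names here are ours:

```python
import shlex

# Split one audit record into its key=value fields.
def audit_fields(record: str) -> dict:
    fields = {}
    for token in shlex.split(record):
        key, sep, value = token.partition("=")
        if sep:
            fields[key] = value
    return fields

rec = ("audit[1625]: USER_START pid=1625 uid=0 auid=500 ses=9 "
       "msg='op=PAM:session_open acct=\"core\" res=success'")
print(audit_fields(rec)["auid"])  # -> 500
```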
Feb 13 09:54:38.310631 env[1648]: time="2024-02-13T09:54:38.310598301Z" level=info msg="Starting up" Feb 13 09:54:38.311437 env[1648]: time="2024-02-13T09:54:38.311387390Z" level=info msg="parsed scheme: \"unix\"" module=grpc Feb 13 09:54:38.311437 env[1648]: time="2024-02-13T09:54:38.311399768Z" level=info msg="scheme \"unix\" not registered, fallback to default scheme" module=grpc Feb 13 09:54:38.311437 env[1648]: time="2024-02-13T09:54:38.311414668Z" level=info msg="ccResolverWrapper: sending update to cc: {[{unix:///var/run/docker/libcontainerd/docker-containerd.sock 0 }] }" module=grpc Feb 13 09:54:38.311437 env[1648]: time="2024-02-13T09:54:38.311422097Z" level=info msg="ClientConn switching balancer to \"pick_first\"" module=grpc Feb 13 09:54:38.313253 env[1648]: time="2024-02-13T09:54:38.313238678Z" level=info msg="parsed scheme: \"unix\"" module=grpc Feb 13 09:54:38.313253 env[1648]: time="2024-02-13T09:54:38.313250023Z" level=info msg="scheme \"unix\" not registered, fallback to default scheme" module=grpc Feb 13 09:54:38.313331 env[1648]: time="2024-02-13T09:54:38.313261311Z" level=info msg="ccResolverWrapper: sending update to cc: {[{unix:///var/run/docker/libcontainerd/docker-containerd.sock 0 }] }" module=grpc Feb 13 09:54:38.313331 env[1648]: time="2024-02-13T09:54:38.313267742Z" level=info msg="ClientConn switching balancer to \"pick_first\"" module=grpc Feb 13 09:54:38.330226 env[1648]: time="2024-02-13T09:54:38.330166978Z" level=info msg="Loading containers: start." Feb 13 09:54:38.359000 audit[1693]: NETFILTER_CFG table=nat:2 family=2 entries=2 op=nft_register_chain pid=1693 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 09:54:38.359000 audit[1693]: SYSCALL arch=c000003e syscall=46 success=yes exit=116 a0=3 a1=7fff12ca2660 a2=0 a3=7fff12ca264c items=0 ppid=1648 pid=1693 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:38.359000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552 Feb 13 09:54:38.360000 audit[1695]: NETFILTER_CFG table=filter:3 family=2 entries=2 op=nft_register_chain pid=1695 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 09:54:38.360000 audit[1695]: SYSCALL arch=c000003e syscall=46 success=yes exit=124 a0=3 a1=7ffdaa2c9680 a2=0 a3=7ffdaa2c966c items=0 ppid=1648 pid=1695 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:38.360000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552 Feb 13 09:54:38.361000 audit[1697]: NETFILTER_CFG table=filter:4 family=2 entries=1 op=nft_register_chain pid=1697 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 09:54:38.361000 audit[1697]: SYSCALL arch=c000003e syscall=46 success=yes exit=112 a0=3 a1=7ffed551f360 a2=0 a3=7ffed551f34c items=0 ppid=1648 pid=1697 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:38.361000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31 Feb 13 09:54:38.362000 audit[1699]: 
NETFILTER_CFG table=filter:5 family=2 entries=1 op=nft_register_chain pid=1699 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 09:54:38.362000 audit[1699]: SYSCALL arch=c000003e syscall=46 success=yes exit=112 a0=3 a1=7fff2f991530 a2=0 a3=7fff2f99151c items=0 ppid=1648 pid=1699 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:38.362000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32 Feb 13 09:54:38.364000 audit[1701]: NETFILTER_CFG table=filter:6 family=2 entries=1 op=nft_register_rule pid=1701 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 09:54:38.364000 audit[1701]: SYSCALL arch=c000003e syscall=46 success=yes exit=228 a0=3 a1=7ffd6fef1ea0 a2=0 a3=7ffd6fef1e8c items=0 ppid=1648 pid=1701 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:38.364000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4100444F434B45522D49534F4C4154494F4E2D53544147452D31002D6A0052455455524E Feb 13 09:54:38.399000 audit[1706]: NETFILTER_CFG table=filter:7 family=2 entries=1 op=nft_register_rule pid=1706 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 09:54:38.399000 audit[1706]: SYSCALL arch=c000003e syscall=46 success=yes exit=228 a0=3 a1=7ffeb863f0b0 a2=0 a3=7ffeb863f09c items=0 ppid=1648 pid=1706 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:38.399000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4100444F434B45522D49534F4C4154494F4E2D53544147452D32002D6A0052455455524E Feb 13 09:54:38.407000 audit[1708]: NETFILTER_CFG table=filter:8 family=2 entries=1 op=nft_register_chain pid=1708 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 09:54:38.407000 audit[1708]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffc1a30bfd0 a2=0 a3=7ffc1a30bfbc items=0 ppid=1648 pid=1708 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:38.407000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552 Feb 13 09:54:38.408000 audit[1710]: NETFILTER_CFG table=filter:9 family=2 entries=1 op=nft_register_rule pid=1710 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 09:54:38.408000 audit[1710]: SYSCALL arch=c000003e syscall=46 success=yes exit=212 a0=3 a1=7fff269c7ac0 a2=0 a3=7fff269c7aac items=0 ppid=1648 pid=1710 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:38.408000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E Feb 13 09:54:38.411000 audit[1712]: NETFILTER_CFG table=filter:10 family=2 entries=2 op=nft_register_chain pid=1712 subj=system_u:system_r:kernel_t:s0 comm="iptables" 
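Decoding every PROCTITLE in this stretch with the same trick as above reconstructs docker's chain bootstrap: iptables --wait creating the DOCKER chain in both nat and filter, the DOCKER-ISOLATION-STAGE-1/-2 and DOCKER-USER chains, and their -j RETURN rules, with FORWARD wired into DOCKER-USER just below. A sketch that does the extraction in bulk:

```python
import re

# Decode every PROCTITLE field in a journal excerpt (hex argv, NUL-separated),
# reconstructing the iptables invocations recorded above.
def commands(journal_text: str) -> list[str]:
    return [
        " ".join(bytes.fromhex(h).decode("utf-8", "replace").split("\x00"))
        for h in re.findall(r"proctitle=([0-9A-Fa-f]+)", journal_text)
    ]

excerpt = ("audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573"
           "002D2D77616974002D74006E6174002D4E00444F434B4552")
print(commands(excerpt))  # -> ['/usr/sbin/iptables --wait -t nat -N DOCKER']
```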
Feb 13 09:54:38.411000 audit[1712]: SYSCALL arch=c000003e syscall=46 success=yes exit=308 a0=3 a1=7ffd5c96ddb0 a2=0 a3=7ffd5c96dd9c items=0 ppid=1648 pid=1712 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:38.411000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Feb 13 09:54:38.417000 audit[1716]: NETFILTER_CFG table=filter:11 family=2 entries=1 op=nft_unregister_rule pid=1716 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 09:54:38.417000 audit[1716]: SYSCALL arch=c000003e syscall=46 success=yes exit=216 a0=3 a1=7ffd50ea7fa0 a2=0 a3=7ffd50ea7f8c items=0 ppid=1648 pid=1716 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:38.417000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4400464F5257415244002D6A00444F434B45522D55534552 Feb 13 09:54:38.419000 audit[1717]: NETFILTER_CFG table=filter:12 family=2 entries=1 op=nft_register_rule pid=1717 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 09:54:38.419000 audit[1717]: SYSCALL arch=c000003e syscall=46 success=yes exit=224 a0=3 a1=7ffc301e9b50 a2=0 a3=7ffc301e9b3c items=0 ppid=1648 pid=1717 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:38.419000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Feb 13 09:54:38.439446 kernel: Initializing XFRM netlink socket Feb 13 09:54:38.498509 env[1648]: time="2024-02-13T09:54:38.498488554Z" level=info msg="Default bridge (docker0) is assigned with an IP address 172.17.0.0/16. 
Daemon option --bip can be used to set a preferred IP address" Feb 13 09:54:38.511000 audit[1725]: NETFILTER_CFG table=nat:13 family=2 entries=2 op=nft_register_chain pid=1725 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 09:54:38.511000 audit[1725]: SYSCALL arch=c000003e syscall=46 success=yes exit=492 a0=3 a1=7ffff81dd1b0 a2=0 a3=7ffff81dd19c items=0 ppid=1648 pid=1725 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:38.511000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D74006E6174002D4900504F5354524F5554494E47002D73003137322E31372E302E302F31360000002D6F00646F636B657230002D6A004D415351554552414445 Feb 13 09:54:38.525000 audit[1728]: NETFILTER_CFG table=nat:14 family=2 entries=1 op=nft_register_rule pid=1728 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 09:54:38.525000 audit[1728]: SYSCALL arch=c000003e syscall=46 success=yes exit=288 a0=3 a1=7ffdccc04cd0 a2=0 a3=7ffdccc04cbc items=0 ppid=1648 pid=1728 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:38.525000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D74006E6174002D4900444F434B4552002D6900646F636B657230002D6A0052455455524E Feb 13 09:54:38.527000 audit[1731]: NETFILTER_CFG table=filter:15 family=2 entries=1 op=nft_register_rule pid=1731 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 09:54:38.527000 audit[1731]: SYSCALL arch=c000003e syscall=46 success=yes exit=376 a0=3 a1=7ffcff9aed30 a2=0 a3=7ffcff9aed1c items=0 ppid=1648 pid=1731 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:38.527000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6900646F636B657230002D6F00646F636B657230002D6A00414343455054 Feb 13 09:54:38.528000 audit[1733]: NETFILTER_CFG table=filter:16 family=2 entries=1 op=nft_register_rule pid=1733 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 09:54:38.528000 audit[1733]: SYSCALL arch=c000003e syscall=46 success=yes exit=376 a0=3 a1=7ffd76649db0 a2=0 a3=7ffd76649d9c items=0 ppid=1648 pid=1733 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:38.528000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6900646F636B6572300000002D6F00646F636B657230002D6A00414343455054 Feb 13 09:54:38.529000 audit[1735]: NETFILTER_CFG table=nat:17 family=2 entries=2 op=nft_register_chain pid=1735 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 09:54:38.529000 audit[1735]: SYSCALL arch=c000003e syscall=46 success=yes exit=356 a0=3 a1=7ffe64c7ece0 a2=0 a3=7ffe64c7eccc items=0 ppid=1648 pid=1735 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:38.529000 audit: PROCTITLE 
proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552 Feb 13 09:54:38.530000 audit[1737]: NETFILTER_CFG table=nat:18 family=2 entries=2 op=nft_register_chain pid=1737 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 09:54:38.530000 audit[1737]: SYSCALL arch=c000003e syscall=46 success=yes exit=444 a0=3 a1=7ffe3438f580 a2=0 a3=7ffe3438f56c items=0 ppid=1648 pid=1737 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:38.530000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003132372E302E302E302F38 Feb 13 09:54:38.531000 audit[1739]: NETFILTER_CFG table=filter:19 family=2 entries=1 op=nft_register_rule pid=1739 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 09:54:38.531000 audit[1739]: SYSCALL arch=c000003e syscall=46 success=yes exit=304 a0=3 a1=7fff6dd09c60 a2=0 a3=7fff6dd09c4c items=0 ppid=1648 pid=1739 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:38.531000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6F00646F636B657230002D6A00444F434B4552 Feb 13 09:54:38.537000 audit[1742]: NETFILTER_CFG table=filter:20 family=2 entries=1 op=nft_register_rule pid=1742 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 09:54:38.537000 audit[1742]: SYSCALL arch=c000003e syscall=46 success=yes exit=508 a0=3 a1=7ffdb0f4a6d0 a2=0 a3=7ffdb0f4a6bc items=0 ppid=1648 pid=1742 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:38.537000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6F00646F636B657230002D6D00636F6E6E747261636B002D2D637473746174650052454C415445442C45535441424C4953484544002D6A00414343455054 Feb 13 09:54:38.538000 audit[1744]: NETFILTER_CFG table=filter:21 family=2 entries=1 op=nft_register_rule pid=1744 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 09:54:38.538000 audit[1744]: SYSCALL arch=c000003e syscall=46 success=yes exit=240 a0=3 a1=7ffdcd1dbcf0 a2=0 a3=7ffdcd1dbcdc items=0 ppid=1648 pid=1744 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:38.538000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31 Feb 13 09:54:38.540000 audit[1746]: NETFILTER_CFG table=filter:22 family=2 entries=1 op=nft_register_rule pid=1746 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 09:54:38.540000 audit[1746]: SYSCALL arch=c000003e syscall=46 success=yes exit=428 a0=3 a1=7fff65ed8d80 a2=0 a3=7fff65ed8d6c items=0 ppid=1648 pid=1746 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" 
exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:38.540000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D740066696C746572002D4900444F434B45522D49534F4C4154494F4E2D53544147452D31002D6900646F636B6572300000002D6F00646F636B657230002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D32 Feb 13 09:54:38.541000 audit[1748]: NETFILTER_CFG table=filter:23 family=2 entries=1 op=nft_register_rule pid=1748 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 09:54:38.541000 audit[1748]: SYSCALL arch=c000003e syscall=46 success=yes exit=312 a0=3 a1=7ffc8983f8d0 a2=0 a3=7ffc8983f8bc items=0 ppid=1648 pid=1748 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:38.541000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D740066696C746572002D4900444F434B45522D49534F4C4154494F4E2D53544147452D32002D6F00646F636B657230002D6A0044524F50 Feb 13 09:54:38.542045 systemd-networkd[1325]: docker0: Link UP Feb 13 09:54:38.545000 audit[1752]: NETFILTER_CFG table=filter:24 family=2 entries=1 op=nft_unregister_rule pid=1752 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 09:54:38.545000 audit[1752]: SYSCALL arch=c000003e syscall=46 success=yes exit=228 a0=3 a1=7ffc7730d890 a2=0 a3=7ffc7730d87c items=0 ppid=1648 pid=1752 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:38.545000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4400464F5257415244002D6A00444F434B45522D55534552 Feb 13 09:54:38.546000 audit[1753]: NETFILTER_CFG table=filter:25 family=2 entries=1 op=nft_register_rule pid=1753 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 09:54:38.546000 audit[1753]: SYSCALL arch=c000003e syscall=46 success=yes exit=224 a0=3 a1=7ffe76f7fb80 a2=0 a3=7ffe76f7fb6c items=0 ppid=1648 pid=1753 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:38.546000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Feb 13 09:54:38.547117 env[1648]: time="2024-02-13T09:54:38.547081173Z" level=info msg="Loading containers: done." Feb 13 09:54:38.555856 env[1648]: time="2024-02-13T09:54:38.555806161Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Feb 13 09:54:38.555976 env[1648]: time="2024-02-13T09:54:38.555936911Z" level=info msg="Docker daemon" commit=112bdf3343 graphdriver(s)=overlay2 version=20.10.23 Feb 13 09:54:38.556021 env[1648]: time="2024-02-13T09:54:38.556006468Z" level=info msg="Daemon has completed initialization" Feb 13 09:54:38.565442 systemd[1]: Started docker.service. Feb 13 09:54:38.565000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=docker comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 09:54:38.571923 env[1648]: time="2024-02-13T09:54:38.571845767Z" level=info msg="API listen on /run/docker.sock" Feb 13 09:54:38.602968 systemd[1]: Reloading. Feb 13 09:54:38.664324 /usr/lib/systemd/system-generators/torcx-generator[1806]: time="2024-02-13T09:54:38Z" level=debug msg="common configuration parsed" base_dir=/var/lib/torcx/ conf_dir=/etc/torcx/ run_dir=/run/torcx/ store_paths="[/usr/share/torcx/store /usr/share/oem/torcx/store/3510.3.2 /usr/share/oem/torcx/store /var/lib/torcx/store/3510.3.2 /var/lib/torcx/store]" Feb 13 09:54:38.664351 /usr/lib/systemd/system-generators/torcx-generator[1806]: time="2024-02-13T09:54:38Z" level=info msg="torcx already run" Feb 13 09:54:38.716313 systemd[1]: /usr/lib/systemd/system/locksmithd.service:8: Unit uses CPUShares=; please use CPUWeight= instead. Support for CPUShares= will be removed soon. Feb 13 09:54:38.716322 systemd[1]: /usr/lib/systemd/system/locksmithd.service:9: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 13 09:54:38.729744 systemd[1]: /run/systemd/system/docker.socket:8: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Feb 13 09:54:38.775000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.775000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.775000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.775000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.775000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.775000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.775000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.775000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.775000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.775000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.775000 audit: BPF prog-id=37 op=LOAD Feb 13 09:54:38.775000 audit: BPF prog-id=30 op=UNLOAD Feb 13 09:54:38.776000 audit[1]: AVC avc: denied { bpf } for pid=1 
comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.776000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.776000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.776000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.776000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.776000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.776000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.776000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.776000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.776000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.776000 audit: BPF prog-id=38 op=LOAD Feb 13 09:54:38.776000 audit: BPF prog-id=26 op=UNLOAD Feb 13 09:54:38.776000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.776000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.776000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.777000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.777000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.777000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.777000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 
tclass=capability2 permissive=0 Feb 13 09:54:38.777000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.777000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.777000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.777000 audit: BPF prog-id=39 op=LOAD Feb 13 09:54:38.777000 audit: BPF prog-id=35 op=UNLOAD Feb 13 09:54:38.777000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.777000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.777000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.777000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.777000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.777000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.777000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.777000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.777000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.777000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.777000 audit: BPF prog-id=40 op=LOAD Feb 13 09:54:38.777000 audit: BPF prog-id=21 op=UNLOAD Feb 13 09:54:38.777000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.777000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.777000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 
tclass=capability2 permissive=0 Feb 13 09:54:38.777000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.777000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.777000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.777000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.777000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.777000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.777000 audit: BPF prog-id=41 op=LOAD Feb 13 09:54:38.777000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.777000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.778000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.778000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.778000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.778000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.778000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.778000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.778000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.778000 audit: BPF prog-id=42 op=LOAD Feb 13 09:54:38.778000 audit: BPF prog-id=22 op=UNLOAD Feb 13 09:54:38.778000 audit: BPF prog-id=23 op=UNLOAD Feb 13 09:54:38.778000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 
tclass=capability2 permissive=0 Feb 13 09:54:38.778000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.778000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.778000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.778000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.778000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.778000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.778000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.778000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.778000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.778000 audit: BPF prog-id=43 op=LOAD Feb 13 09:54:38.778000 audit: BPF prog-id=32 op=UNLOAD Feb 13 09:54:38.778000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.778000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.778000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.778000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.778000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.778000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.778000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.778000 audit[1]: AVC avc: denied { bpf } for pid=1 
comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.779000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.779000 audit: BPF prog-id=44 op=LOAD Feb 13 09:54:38.779000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.779000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.779000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.779000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.779000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.779000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.779000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.779000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.779000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.779000 audit: BPF prog-id=45 op=LOAD Feb 13 09:54:38.779000 audit: BPF prog-id=33 op=UNLOAD Feb 13 09:54:38.779000 audit: BPF prog-id=34 op=UNLOAD Feb 13 09:54:38.779000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.779000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.779000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.779000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.779000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.779000 audit[1]: AVC avc: denied { perfmon } for pid=1 
comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.779000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.779000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.779000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.779000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.779000 audit: BPF prog-id=46 op=LOAD Feb 13 09:54:38.779000 audit: BPF prog-id=27 op=UNLOAD Feb 13 09:54:38.779000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.779000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.779000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.779000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.779000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.779000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.779000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.779000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.779000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.779000 audit: BPF prog-id=47 op=LOAD Feb 13 09:54:38.779000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.779000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.779000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 
scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.779000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.779000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.779000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.779000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.779000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.779000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.779000 audit: BPF prog-id=48 op=LOAD Feb 13 09:54:38.780000 audit: BPF prog-id=28 op=UNLOAD Feb 13 09:54:38.780000 audit: BPF prog-id=29 op=UNLOAD Feb 13 09:54:38.780000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.780000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.780000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.780000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.780000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.780000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.780000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.780000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.780000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.780000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 
tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.780000 audit: BPF prog-id=49 op=LOAD Feb 13 09:54:38.780000 audit: BPF prog-id=31 op=UNLOAD Feb 13 09:54:38.781000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.781000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.781000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.781000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.781000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.781000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.781000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.781000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.781000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.781000 audit: BPF prog-id=50 op=LOAD Feb 13 09:54:38.781000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.781000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.781000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.781000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.781000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.781000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.781000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 
permissive=0 Feb 13 09:54:38.781000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.781000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:38.781000 audit: BPF prog-id=51 op=LOAD Feb 13 09:54:38.781000 audit: BPF prog-id=24 op=UNLOAD Feb 13 09:54:38.781000 audit: BPF prog-id=25 op=UNLOAD Feb 13 09:54:38.785858 systemd[1]: Started kubelet.service. Feb 13 09:54:38.785000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:54:38.810014 kubelet[1862]: E0213 09:54:38.809910 1862 run.go:74] "command failed" err="failed to validate kubelet flags: the container runtime endpoint address was not specified or empty, use --container-runtime-endpoint to set" Feb 13 09:54:38.811309 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Feb 13 09:54:38.811397 systemd[1]: kubelet.service: Failed with result 'exit-code'. Feb 13 09:54:38.811000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Feb 13 09:54:39.440399 env[1473]: time="2024-02-13T09:54:39.440251993Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.26.13\"" Feb 13 09:54:40.101330 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4160795062.mount: Deactivated successfully. 
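The kubelet exits above because no --container-runtime-endpoint was passed, exactly as run.go:74 reports. A minimal sketch of a fix, assuming containerd's conventional socket path and a hypothetical drop-in file name (neither appears in this log), is a systemd override:

    # /etc/systemd/system/kubelet.service.d/10-cri.conf  (hypothetical path)
    [Service]
    # An empty ExecStart= clears the value inherited from the shipped unit;
    # the second line restates it with the CRI socket added. The binary path
    # and any other flags must match the unit actually installed on the host.
    ExecStart=
    ExecStart=/usr/bin/kubelet --container-runtime-endpoint=unix:///run/containerd/containerd.sock

After a systemctl daemon-reload, the restart loop recorded below should pass flag validation instead of exiting with status=1/FAILURE.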
Feb 13 09:54:41.358757 env[1473]: time="2024-02-13T09:54:41.358701177Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/kube-apiserver:v1.26.13,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 13 09:54:41.359674 env[1473]: time="2024-02-13T09:54:41.359631059Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:84900298406b2df97ade16b73c49c2b73265ded8735ac19a4e20c2a4ad65853f,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 13 09:54:41.360728 env[1473]: time="2024-02-13T09:54:41.360694721Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/kube-apiserver:v1.26.13,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 13 09:54:41.361637 env[1473]: time="2024-02-13T09:54:41.361600603Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/kube-apiserver@sha256:2f28bed4096abd572a56595ac0304238bdc271dcfe22c650707c09bf97ec16fd,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 13 09:54:41.362094 env[1473]: time="2024-02-13T09:54:41.362058116Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.26.13\" returns image reference \"sha256:84900298406b2df97ade16b73c49c2b73265ded8735ac19a4e20c2a4ad65853f\""
Feb 13 09:54:41.368180 env[1473]: time="2024-02-13T09:54:41.368151005Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.26.13\""
Feb 13 09:54:42.980315 env[1473]: time="2024-02-13T09:54:42.980244242Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/kube-controller-manager:v1.26.13,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 13 09:54:42.980890 env[1473]: time="2024-02-13T09:54:42.980877742Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:921f237b560bdb02300f82d3606635d395b20635512fab10f0191cff42079486,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 13 09:54:42.981980 env[1473]: time="2024-02-13T09:54:42.981953114Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/kube-controller-manager:v1.26.13,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 13 09:54:42.983589 env[1473]: time="2024-02-13T09:54:42.983570557Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/kube-controller-manager@sha256:fda420c6c15cdd01c4eba3404f0662fe486a9c7f38fa13c741a21334673841a2,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 13 09:54:42.984390 env[1473]: time="2024-02-13T09:54:42.984331514Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.26.13\" returns image reference \"sha256:921f237b560bdb02300f82d3606635d395b20635512fab10f0191cff42079486\""
Feb 13 09:54:42.990553 env[1473]: time="2024-02-13T09:54:42.990486025Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.26.13\""
Feb 13 09:54:44.027101 env[1473]: time="2024-02-13T09:54:44.027069968Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/kube-scheduler:v1.26.13,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 13 09:54:44.027720 env[1473]: time="2024-02-13T09:54:44.027706927Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:4fe82b56f06250b6b7eb3d5a879cd2cfabf41cb3e45b24af6059eadbc3b8026e,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 13 09:54:44.028656 env[1473]: time="2024-02-13T09:54:44.028645082Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/kube-scheduler:v1.26.13,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 13 09:54:44.029464 env[1473]: time="2024-02-13T09:54:44.029453681Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/kube-scheduler@sha256:c3c7303ee6d01c8e5a769db28661cf854b55175aa72c67e9b6a7b9d47ac42af3,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 13 09:54:44.029845 env[1473]: time="2024-02-13T09:54:44.029833562Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.26.13\" returns image reference \"sha256:4fe82b56f06250b6b7eb3d5a879cd2cfabf41cb3e45b24af6059eadbc3b8026e\""
Feb 13 09:54:44.039016 env[1473]: time="2024-02-13T09:54:44.038953271Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.26.13\""
Feb 13 09:54:44.917039 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3416284424.mount: Deactivated successfully.
Feb 13 09:54:45.205641 env[1473]: time="2024-02-13T09:54:45.205558460Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/kube-proxy:v1.26.13,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 13 09:54:45.206167 env[1473]: time="2024-02-13T09:54:45.206140409Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:5a7325fa2b6e8d712e4a770abb4a5a5852e87b6de8df34552d67853e9bfb9f9f,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 13 09:54:45.206929 env[1473]: time="2024-02-13T09:54:45.206892231Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/kube-proxy:v1.26.13,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 13 09:54:45.207625 env[1473]: time="2024-02-13T09:54:45.207586382Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/kube-proxy@sha256:f6e0de32a002b910b9b2e0e8d769e2d7b05208240559c745ce4781082ab15f22,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 13 09:54:45.207921 env[1473]: time="2024-02-13T09:54:45.207874749Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.26.13\" returns image reference \"sha256:5a7325fa2b6e8d712e4a770abb4a5a5852e87b6de8df34552d67853e9bfb9f9f\""
Feb 13 09:54:45.213859 env[1473]: time="2024-02-13T09:54:45.213803869Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\""
Feb 13 09:54:45.745560 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1660871645.mount: Deactivated successfully.
Feb 13 09:54:45.747109 env[1473]: time="2024-02-13T09:54:45.747057382Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/pause:3.9,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 13 09:54:45.747749 env[1473]: time="2024-02-13T09:54:45.747736325Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 13 09:54:45.748401 env[1473]: time="2024-02-13T09:54:45.748363369Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/pause:3.9,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 13 09:54:45.749415 env[1473]: time="2024-02-13T09:54:45.749360983Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 13 09:54:45.749653 env[1473]: time="2024-02-13T09:54:45.749639012Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\" returns image reference \"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\""
Feb 13 09:54:45.756092 env[1473]: time="2024-02-13T09:54:45.756075099Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.6-0\""
Feb 13 09:54:46.403005 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1217471010.mount: Deactivated successfully.
Feb 13 09:54:49.061924 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
Feb 13 09:54:49.061000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 09:54:49.062066 systemd[1]: Stopped kubelet.service.
Feb 13 09:54:49.062939 systemd[1]: Started kubelet.service.
Feb 13 09:54:49.067676 kernel: kauditd_printk_skb: 259 callbacks suppressed
Feb 13 09:54:49.067717 kernel: audit: type=1130 audit(1707818089.061:388): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 09:54:49.087342 kubelet[1949]: E0213 09:54:49.087282 1949 run.go:74] "command failed" err="failed to validate kubelet flags: the container runtime endpoint address was not specified or empty, use --container-runtime-endpoint to set"
Feb 13 09:54:49.089485 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Feb 13 09:54:49.089553 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Feb 13 09:54:49.061000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 09:54:49.128302 kernel: audit: type=1131 audit(1707818089.061:389): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 09:54:49.128340 kernel: audit: type=1130 audit(1707818089.062:390): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 09:54:49.062000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 09:54:49.184530 kernel: audit: type=1131 audit(1707818089.089:391): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed'
Feb 13 09:54:49.089000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed'
Feb 13 09:54:49.331198 env[1473]: time="2024-02-13T09:54:49.331106638Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/etcd:3.5.6-0,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 13 09:54:49.331660 env[1473]: time="2024-02-13T09:54:49.331617421Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:fce326961ae2d51a5f726883fd59d2a8c2ccc3e45d3bb859882db58e422e59e7,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 13 09:54:49.332835 env[1473]: time="2024-02-13T09:54:49.332794029Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/etcd:3.5.6-0,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 13 09:54:49.333460 env[1473]: time="2024-02-13T09:54:49.333419842Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/etcd@sha256:dd75ec974b0a2a6f6bb47001ba09207976e625db898d1b16735528c009cb171c,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 13 09:54:49.333818 env[1473]: time="2024-02-13T09:54:49.333776623Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.6-0\" returns image reference \"sha256:fce326961ae2d51a5f726883fd59d2a8c2ccc3e45d3bb859882db58e422e59e7\""
Feb 13 09:54:49.338877 env[1473]: time="2024-02-13T09:54:49.338830967Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.9.3\""
Feb 13 09:54:49.913849 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3841833866.mount: Deactivated successfully.
Feb 13 09:54:50.289875 env[1473]: time="2024-02-13T09:54:50.289753396Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/coredns/coredns:v1.9.3,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 13 09:54:50.290516 env[1473]: time="2024-02-13T09:54:50.290475283Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:5185b96f0becf59032b8e3646e99f84d9655dff3ac9e2605e0dc77f9c441ae4a,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 13 09:54:50.291208 env[1473]: time="2024-02-13T09:54:50.291162053Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/coredns/coredns:v1.9.3,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 13 09:54:50.292591 env[1473]: time="2024-02-13T09:54:50.292571002Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/coredns/coredns@sha256:8e352a029d304ca7431c6507b56800636c321cb52289686a581ab70aaa8a2e2a,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 13 09:54:50.293218 env[1473]: time="2024-02-13T09:54:50.293162579Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.9.3\" returns image reference \"sha256:5185b96f0becf59032b8e3646e99f84d9655dff3ac9e2605e0dc77f9c441ae4a\""
Feb 13 09:54:52.482172 systemd[1]: Stopped kubelet.service.
Feb 13 09:54:52.481000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 09:54:52.490120 systemd[1]: Reloading.
Feb 13 09:54:52.523711 /usr/lib/systemd/system-generators/torcx-generator[2101]: time="2024-02-13T09:54:52Z" level=debug msg="common configuration parsed" base_dir=/var/lib/torcx/ conf_dir=/etc/torcx/ run_dir=/run/torcx/ store_paths="[/usr/share/torcx/store /usr/share/oem/torcx/store/3510.3.2 /usr/share/oem/torcx/store /var/lib/torcx/store/3510.3.2 /var/lib/torcx/store]"
Feb 13 09:54:52.523732 /usr/lib/systemd/system-generators/torcx-generator[2101]: time="2024-02-13T09:54:52Z" level=info msg="torcx already run"
Feb 13 09:54:52.481000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 09:54:52.541408 kernel: audit: type=1130 audit(1707818092.481:392): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 09:54:52.541449 kernel: audit: type=1131 audit(1707818092.481:393): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 09:54:52.618949 systemd[1]: /usr/lib/systemd/system/locksmithd.service:8: Unit uses CPUShares=; please use CPUWeight= instead. Support for CPUShares= will be removed soon.
Feb 13 09:54:52.618957 systemd[1]: /usr/lib/systemd/system/locksmithd.service:9: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 13 09:54:52.629755 systemd[1]: /run/systemd/system/docker.socket:8: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
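The three systemd warnings above are mechanical unit-file migrations: CPUShares= and MemoryLimit= are cgroup-v1-era directives superseded by CPUWeight= and MemoryMax=, and socket paths under /var/run/ belong under /run/. A sketch of the updated stanzas, with placeholder values since the original unit contents are not part of this log:

    # /usr/lib/systemd/system/locksmithd.service, [Service] section
    CPUWeight=100      # replaces CPUShares=; note the scales differ
                       # (CPUShares defaults to 1024, CPUWeight to 100)
    MemoryMax=512M     # replaces MemoryLimit=; 512M is a placeholder

    # /run/systemd/system/docker.socket, [Socket] section
    ListenStream=/run/docker.sock   # was /var/run/docker.sock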
Feb 13 09:54:52.675000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 09:54:52.675000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 09:54:52.795782 kernel: audit: type=1400 audit(1707818092.675:394): avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 09:54:52.857787 kernel: audit: type=1400 audit(1707818092.675:397): avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 09:54:52.795000 audit: BPF prog-id=52 op=LOAD
Feb 13 09:54:52.795000 audit: BPF prog-id=37 op=UNLOAD
Feb 13 09:54:52.920000 audit: BPF prog-id=53 op=LOAD
Feb 13 09:54:52.920000 audit: BPF prog-id=38 op=UNLOAD
Feb 13 09:54:52.921000 audit: BPF prog-id=54 op=LOAD
Feb 13 09:54:52.921000 audit: BPF prog-id=39 op=UNLOAD
Feb 13 09:54:52.921000 audit: BPF prog-id=55 op=LOAD
Feb 13 09:54:52.921000 audit: BPF prog-id=40 op=UNLOAD
Feb 13 09:54:52.922000 audit: BPF prog-id=56 op=LOAD
Feb 13 09:54:52.922000 audit: BPF prog-id=57 op=LOAD
Feb 13 09:54:52.922000 audit: BPF prog-id=41 op=UNLOAD
Feb 13 09:54:52.922000 audit: BPF prog-id=42 op=UNLOAD
Feb 13 09:54:52.922000 audit: BPF prog-id=58 op=LOAD
Feb 13 09:54:52.922000 audit: BPF prog-id=43 op=UNLOAD
Feb 13 09:54:52.923000 audit: BPF prog-id=59 op=LOAD
Feb 13 09:54:52.923000 audit: BPF prog-id=60 op=LOAD
Feb 13 09:54:52.923000 audit: BPF prog-id=44 op=UNLOAD
Feb 13 09:54:52.923000 audit: BPF prog-id=45 op=UNLOAD
Feb 13 09:54:52.923000 audit: BPF prog-id=61 op=LOAD
Feb 13 09:54:52.923000 audit: BPF prog-id=46 op=UNLOAD
Feb 13 09:54:52.923000 audit: BPF prog-id=62 op=LOAD
Feb 13 09:54:52.924000 audit: BPF prog-id=63 op=LOAD
Feb 13 09:54:52.924000 audit: BPF prog-id=47 op=UNLOAD
Feb 13 09:54:52.924000 audit: BPF prog-id=48 op=UNLOAD
Feb 13 09:54:52.924000 audit: BPF prog-id=64 op=LOAD
Feb 13 09:54:52.924000 audit: BPF prog-id=49 op=UNLOAD
Feb 13 09:54:52.925000 audit: BPF prog-id=65 op=LOAD
tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:52.926000 audit: BPF prog-id=66 op=LOAD Feb 13 09:54:52.926000 audit: BPF prog-id=50 op=UNLOAD Feb 13 09:54:52.926000 audit: BPF prog-id=51 op=UNLOAD Feb 13 09:54:52.932894 systemd[1]: Started kubelet.service. Feb 13 09:54:52.932000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:54:52.957157 kubelet[2161]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.27. Image garbage collector will get sandbox image information from CRI. Feb 13 09:54:52.957157 kubelet[2161]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 13 09:54:52.957404 kubelet[2161]: I0213 09:54:52.957149 2161 server.go:198] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Feb 13 09:54:52.957946 kubelet[2161]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.27. Image garbage collector will get sandbox image information from CRI. Feb 13 09:54:52.957946 kubelet[2161]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 13 09:54:53.097616 kubelet[2161]: I0213 09:54:53.097567 2161 server.go:412] "Kubelet version" kubeletVersion="v1.26.5" Feb 13 09:54:53.097616 kubelet[2161]: I0213 09:54:53.097592 2161 server.go:414] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Feb 13 09:54:53.097717 kubelet[2161]: I0213 09:54:53.097683 2161 server.go:836] "Client rotation is on, will bootstrap in background" Feb 13 09:54:53.099379 kubelet[2161]: I0213 09:54:53.099335 2161 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Feb 13 09:54:53.099769 kubelet[2161]: E0213 09:54:53.099740 2161 certificate_manager.go:471] kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post "https://139.178.70.43:6443/apis/certificates.k8s.io/v1/certificatesigningrequests": dial tcp 139.178.70.43:6443: connect: connection refused Feb 13 09:54:53.118061 kubelet[2161]: I0213 09:54:53.118023 2161 server.go:659] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Feb 13 09:54:53.118177 kubelet[2161]: I0213 09:54:53.118146 2161 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Feb 13 09:54:53.118200 kubelet[2161]: I0213 09:54:53.118183 2161 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={RuntimeCgroupsName: SystemCgroupsName: KubeletCgroupsName: KubeletOOMScoreAdj:-999 ContainerRuntime: CgroupsPerQOS:true CgroupRoot:/ CgroupDriver:systemd KubeletRootDir:/var/lib/kubelet ProtectKernelDefaults:false NodeAllocatableConfig:{KubeReservedCgroupName: SystemReservedCgroupName: ReservedSystemCPUs: EnforceNodeAllocatable:map[pods:{}] KubeReserved:map[] SystemReserved:map[] HardEvictionThresholds:[{Signal:memory.available Operator:LessThan Value:{Quantity:100Mi Percentage:0} GracePeriod:0s MinReclaim:} {Signal:nodefs.available Operator:LessThan Value:{Quantity: Percentage:0.1} GracePeriod:0s MinReclaim:} {Signal:nodefs.inodesFree Operator:LessThan Value:{Quantity: Percentage:0.05} GracePeriod:0s MinReclaim:} {Signal:imagefs.available Operator:LessThan Value:{Quantity: Percentage:0.15} GracePeriod:0s MinReclaim:}]} QOSReserved:map[] CPUManagerPolicy:none CPUManagerPolicyOptions:map[] ExperimentalTopologyManagerScope:container CPUManagerReconcilePeriod:10s ExperimentalMemoryManagerPolicy:None ExperimentalMemoryManagerReservedMemory:[] ExperimentalPodPidsLimit:-1 EnforceCPULimits:true CPUCFSQuotaPeriod:100ms ExperimentalTopologyManagerPolicy:none ExperimentalTopologyManagerPolicyOptions:map[]} Feb 13 09:54:53.118200 kubelet[2161]: I0213 09:54:53.118193 2161 topology_manager.go:134] "Creating topology manager with policy per scope" topologyPolicyName="none" topologyScopeName="container" Feb 13 09:54:53.118200 kubelet[2161]: I0213 09:54:53.118200 2161 container_manager_linux.go:308] "Creating device plugin manager" Feb 13 09:54:53.118285 kubelet[2161]: I0213 09:54:53.118245 2161 state_mem.go:36] "Initialized new in-memory state store" Feb 13 09:54:53.119957 kubelet[2161]: I0213 09:54:53.119950 2161 kubelet.go:398] "Attempting to sync node with API server" Feb 13 09:54:53.119988 kubelet[2161]: I0213 09:54:53.119960 2161 kubelet.go:286] "Adding static pod path" path="/etc/kubernetes/manifests" Feb 13 09:54:53.119988 kubelet[2161]: I0213 09:54:53.119971 2161 kubelet.go:297] "Adding apiserver pod source" Feb 13 09:54:53.119988 kubelet[2161]: I0213 09:54:53.119980 2161 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Feb 13 09:54:53.120269 kubelet[2161]: W0213 09:54:53.120250 2161 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.Service: Get "https://139.178.70.43:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 139.178.70.43:6443: connect: connection refused Feb 13 09:54:53.120305 kubelet[2161]: E0213 09:54:53.120276 2161 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://139.178.70.43:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 139.178.70.43:6443: connect: connection refused Feb 13 09:54:53.120305 kubelet[2161]: I0213 09:54:53.120277 2161 kuberuntime_manager.go:244] "Container runtime initialized" containerRuntime="containerd" version="1.6.16" apiVersion="v1" Feb 13 09:54:53.120305 kubelet[2161]: W0213 09:54:53.120292 2161 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.Node: Get 
"https://139.178.70.43:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-3510.3.2-a-e401d5bc82&limit=500&resourceVersion=0": dial tcp 139.178.70.43:6443: connect: connection refused Feb 13 09:54:53.120412 kubelet[2161]: E0213 09:54:53.120316 2161 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://139.178.70.43:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-3510.3.2-a-e401d5bc82&limit=500&resourceVersion=0": dial tcp 139.178.70.43:6443: connect: connection refused Feb 13 09:54:53.120452 kubelet[2161]: W0213 09:54:53.120446 2161 probe.go:268] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Feb 13 09:54:53.120674 kubelet[2161]: I0213 09:54:53.120669 2161 server.go:1186] "Started kubelet" Feb 13 09:54:53.120748 kubelet[2161]: I0213 09:54:53.120737 2161 server.go:161] "Starting to listen" address="0.0.0.0" port=10250 Feb 13 09:54:53.121004 kubelet[2161]: E0213 09:54:53.120997 2161 cri_stats_provider.go:455] "Failed to get the info of the filesystem with mountpoint" err="unable to find data in memory cache" mountpoint="/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs" Feb 13 09:54:53.121032 kubelet[2161]: E0213 09:54:53.120955 2161 event.go:276] Unable to write event: '&v1.Event{TypeMeta:v1.TypeMeta{Kind:"", APIVersion:""}, ObjectMeta:v1.ObjectMeta{Name:"ci-3510.3.2-a-e401d5bc82.17b36381dd711d0a", GenerateName:"", Namespace:"default", SelfLink:"", UID:"", ResourceVersion:"", Generation:0, CreationTimestamp:time.Date(1, time.January, 1, 0, 0, 0, 0, time.UTC), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string(nil), Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, InvolvedObject:v1.ObjectReference{Kind:"Node", Namespace:"", Name:"ci-3510.3.2-a-e401d5bc82", UID:"ci-3510.3.2-a-e401d5bc82", APIVersion:"", ResourceVersion:"", FieldPath:""}, Reason:"Starting", Message:"Starting kubelet.", Source:v1.EventSource{Component:"kubelet", Host:"ci-3510.3.2-a-e401d5bc82"}, FirstTimestamp:time.Date(2024, time.February, 13, 9, 54, 53, 120658698, time.Local), LastTimestamp:time.Date(2024, time.February, 13, 9, 54, 53, 120658698, time.Local), Count:1, Type:"Normal", EventTime:time.Date(1, time.January, 1, 0, 0, 0, 0, time.UTC), Series:(*v1.EventSeries)(nil), Action:"", Related:(*v1.ObjectReference)(nil), ReportingController:"", ReportingInstance:""}': 'Post "https://139.178.70.43:6443/api/v1/namespaces/default/events": dial tcp 139.178.70.43:6443: connect: connection refused'(may retry after sleeping) Feb 13 09:54:53.121032 kubelet[2161]: E0213 09:54:53.121009 2161 kubelet.go:1386] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Feb 13 09:54:53.120000 audit[2161]: AVC avc: denied { mac_admin } for pid=2161 comm="kubelet" capability=33 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:53.120000 audit: SELINUX_ERR op=setxattr invalid_context="system_u:object_r:container_file_t:s0" Feb 13 09:54:53.120000 audit[2161]: SYSCALL arch=c000003e syscall=188 success=no exit=-22 a0=c000d3c900 a1=c000efe618 a2=c000d3c8d0 a3=25 items=0 ppid=1 pid=2161 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kubelet" exe="/opt/bin/kubelet" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:53.120000 audit: PROCTITLE proctitle=2F6F70742F62696E2F6B7562656C6574002D2D626F6F7473747261702D6B756265636F6E6669673D2F6574632F6B756265726E657465732F626F6F7473747261702D6B7562656C65742E636F6E66002D2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F6B7562656C65742E636F6E66002D2D636F6E6669 Feb 13 09:54:53.120000 audit[2161]: AVC avc: denied { mac_admin } for pid=2161 comm="kubelet" capability=33 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:53.120000 audit: SELINUX_ERR op=setxattr invalid_context="system_u:object_r:container_file_t:s0" Feb 13 09:54:53.120000 audit[2161]: SYSCALL arch=c000003e syscall=188 success=no exit=-22 a0=c000ee82a0 a1=c000efe630 a2=c000d3c990 a3=25 items=0 ppid=1 pid=2161 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kubelet" exe="/opt/bin/kubelet" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:53.120000 audit: PROCTITLE proctitle=2F6F70742F62696E2F6B7562656C6574002D2D626F6F7473747261702D6B756265636F6E6669673D2F6574632F6B756265726E657465732F626F6F7473747261702D6B7562656C65742E636F6E66002D2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F6B7562656C65742E636F6E66002D2D636F6E6669 Feb 13 09:54:53.121606 kubelet[2161]: I0213 09:54:53.121281 2161 kubelet.go:1341] "Unprivileged containerized plugins might not work, could not set selinux context on plugin registration dir" path="/var/lib/kubelet/plugins_registry" err="setxattr /var/lib/kubelet/plugins_registry: invalid argument" Feb 13 09:54:53.121606 kubelet[2161]: I0213 09:54:53.121300 2161 kubelet.go:1345] "Unprivileged containerized plugins might not work, could not set selinux context on plugins dir" path="/var/lib/kubelet/plugins" err="setxattr /var/lib/kubelet/plugins: invalid argument" Feb 13 09:54:53.121606 kubelet[2161]: I0213 09:54:53.121347 2161 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Feb 13 09:54:53.121606 kubelet[2161]: I0213 09:54:53.121396 2161 volume_manager.go:293] "Starting Kubelet Volume Manager" Feb 13 09:54:53.121606 kubelet[2161]: I0213 09:54:53.121419 2161 server.go:451] "Adding debug handlers to kubelet server" Feb 13 09:54:53.121606 kubelet[2161]: I0213 09:54:53.121423 2161 desired_state_of_world_populator.go:151] "Desired state populator starts to run" Feb 13 09:54:53.121878 kubelet[2161]: W0213 09:54:53.121848 2161 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.CSIDriver: Get "https://139.178.70.43:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.70.43:6443: connect: connection refused Feb 13 09:54:53.121982 kubelet[2161]: E0213 09:54:53.121955 2161 reflector.go:140] 
vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://139.178.70.43:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.70.43:6443: connect: connection refused Feb 13 09:54:53.121982 kubelet[2161]: E0213 09:54:53.121890 2161 controller.go:146] failed to ensure lease exists, will retry in 200ms, error: Get "https://139.178.70.43:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-3510.3.2-a-e401d5bc82?timeout=10s": dial tcp 139.178.70.43:6443: connect: connection refused Feb 13 09:54:53.123000 audit[2185]: NETFILTER_CFG table=mangle:26 family=2 entries=2 op=nft_register_chain pid=2185 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 09:54:53.123000 audit[2185]: SYSCALL arch=c000003e syscall=46 success=yes exit=136 a0=3 a1=7ffce472d080 a2=0 a3=7ffce472d06c items=0 ppid=2161 pid=2185 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:53.123000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Feb 13 09:54:53.123000 audit[2186]: NETFILTER_CFG table=filter:27 family=2 entries=1 op=nft_register_chain pid=2186 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 09:54:53.123000 audit[2186]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffc475edc60 a2=0 a3=7ffc475edc4c items=0 ppid=2161 pid=2186 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:53.123000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4649524557414C4C002D740066696C746572 Feb 13 09:54:53.124000 audit[2188]: NETFILTER_CFG table=filter:28 family=2 entries=2 op=nft_register_chain pid=2188 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 09:54:53.124000 audit[2188]: SYSCALL arch=c000003e syscall=46 success=yes exit=312 a0=3 a1=7ffe2373c350 a2=0 a3=7ffe2373c33c items=0 ppid=2161 pid=2188 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:53.124000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Feb 13 09:54:53.125000 audit[2190]: NETFILTER_CFG table=filter:29 family=2 entries=2 op=nft_register_chain pid=2190 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 09:54:53.125000 audit[2190]: SYSCALL arch=c000003e syscall=46 success=yes exit=312 a0=3 a1=7ffce3a243c0 a2=0 a3=7ffce3a243ac items=0 ppid=2161 pid=2190 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:53.125000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Feb 13 09:54:53.129000 audit[2193]: NETFILTER_CFG table=filter:30 family=2 entries=1 op=nft_register_rule pid=2193 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 09:54:53.129000 audit[2193]: SYSCALL arch=c000003e 
syscall=46 success=yes exit=924 a0=3 a1=7fff49394d20 a2=0 a3=7fff49394d0c items=0 ppid=2161 pid=2193 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:53.129000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D41004B5542452D4649524557414C4C002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E7400626C6F636B20696E636F6D696E67206C6F63616C6E657420636F6E6E656374696F6E73002D2D647374003132372E302E302E302F38 Feb 13 09:54:53.129000 audit[2194]: NETFILTER_CFG table=nat:31 family=2 entries=1 op=nft_register_chain pid=2194 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 09:54:53.129000 audit[2194]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffc7dec4e80 a2=0 a3=7ffc7dec4e6c items=0 ppid=2161 pid=2194 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:53.129000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4D41524B2D44524F50002D74006E6174 Feb 13 09:54:53.132000 audit[2198]: NETFILTER_CFG table=nat:32 family=2 entries=1 op=nft_register_rule pid=2198 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 09:54:53.132000 audit[2198]: SYSCALL arch=c000003e syscall=46 success=yes exit=216 a0=3 a1=7ffc0d4f3bc0 a2=0 a3=7ffc0d4f3bac items=0 ppid=2161 pid=2198 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:53.132000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D41004B5542452D4D41524B2D44524F50002D74006E6174002D6A004D41524B002D2D6F722D6D61726B0030783030303038303030 Feb 13 09:54:53.139000 audit[2201]: NETFILTER_CFG table=filter:33 family=2 entries=1 op=nft_register_rule pid=2201 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 09:54:53.139000 audit[2201]: SYSCALL arch=c000003e syscall=46 success=yes exit=664 a0=3 a1=7fffc29e91d0 a2=0 a3=7fffc29e91bc items=0 ppid=2161 pid=2201 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:53.139000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D41004B5542452D4649524557414C4C002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206669726577616C6C20666F722064726F7070696E67206D61726B6564207061636B657473002D6D006D61726B Feb 13 09:54:53.140000 audit[2202]: NETFILTER_CFG table=nat:34 family=2 entries=1 op=nft_register_chain pid=2202 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 09:54:53.140000 audit[2202]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffd1134dec0 a2=0 a3=7ffd1134deac items=0 ppid=2161 pid=2202 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:53.140000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4D41524B2D4D415351002D74006E6174 Feb 13 09:54:53.140000 audit[2203]: NETFILTER_CFG table=nat:35 family=2 entries=1 
op=nft_register_chain pid=2203 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 09:54:53.140000 audit[2203]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffe2cfe2890 a2=0 a3=7ffe2cfe287c items=0 ppid=2161 pid=2203 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:53.140000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Feb 13 09:54:53.141000 audit[2205]: NETFILTER_CFG table=nat:36 family=2 entries=1 op=nft_register_rule pid=2205 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 09:54:53.141000 audit[2205]: SYSCALL arch=c000003e syscall=46 success=yes exit=216 a0=3 a1=7ffe04d08240 a2=0 a3=7ffe04d0822c items=0 ppid=2161 pid=2205 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:53.141000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D41004B5542452D4D41524B2D4D415351002D74006E6174002D6A004D41524B002D2D6F722D6D61726B0030783030303034303030 Feb 13 09:54:53.142000 audit[2207]: NETFILTER_CFG table=nat:37 family=2 entries=1 op=nft_register_rule pid=2207 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 09:54:53.142000 audit[2207]: SYSCALL arch=c000003e syscall=46 success=yes exit=532 a0=3 a1=7ffd9809fbd0 a2=0 a3=7ffd9809fbbc items=0 ppid=2161 pid=2207 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:53.142000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Feb 13 09:54:53.144000 audit[2209]: NETFILTER_CFG table=nat:38 family=2 entries=1 op=nft_register_rule pid=2209 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 09:54:53.144000 audit[2209]: SYSCALL arch=c000003e syscall=46 success=yes exit=364 a0=3 a1=7fff6a7e7ac0 a2=0 a3=7fff6a7e7aac items=0 ppid=2161 pid=2209 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:53.144000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D41004B5542452D504F5354524F5554494E47002D74006E6174002D6D006D61726B0000002D2D6D61726B00307830303030343030302F30783030303034303030002D6A0052455455524E Feb 13 09:54:53.145000 audit[2211]: NETFILTER_CFG table=nat:39 family=2 entries=1 op=nft_register_rule pid=2211 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 09:54:53.145000 audit[2211]: SYSCALL arch=c000003e syscall=46 success=yes exit=220 a0=3 a1=7ffe7508d650 a2=0 a3=7ffe7508d63c items=0 ppid=2161 pid=2211 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:53.145000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D5700313030303030002D41004B5542452D504F5354524F5554494E47002D74006E6174002D6A004D41524B002D2D786F722D6D61726B0030783030303034303030 Feb 13 09:54:53.146000 audit[2213]: NETFILTER_CFG table=nat:40 family=2 entries=1 op=nft_register_rule pid=2213 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 09:54:53.146000 audit[2213]: SYSCALL arch=c000003e syscall=46 success=yes exit=540 a0=3 a1=7fff7a50cc00 a2=0 a3=7fff7a50cbec items=0 ppid=2161 pid=2213 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:53.146000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D41004B5542452D504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732073657276696365207472616666696320726571756972696E6720534E4154002D6A004D415351554552414445 Feb 13 09:54:53.146824 kubelet[2161]: I0213 09:54:53.146810 2161 kubelet_network_linux.go:63] "Initialized iptables rules." protocol=IPv4 Feb 13 09:54:53.146000 audit[2214]: NETFILTER_CFG table=mangle:41 family=10 entries=2 op=nft_register_chain pid=2214 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 09:54:53.146000 audit[2214]: SYSCALL arch=c000003e syscall=46 success=yes exit=136 a0=3 a1=7fff5293ec40 a2=0 a3=7fff5293ec2c items=0 ppid=2161 pid=2214 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:53.146000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Feb 13 09:54:53.147000 audit[2215]: NETFILTER_CFG table=mangle:42 family=2 entries=1 op=nft_register_chain pid=2215 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 09:54:53.147000 audit[2215]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffdf16a3760 a2=0 a3=7ffdf16a374c items=0 ppid=2161 pid=2215 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:53.147000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Feb 13 09:54:53.147000 audit[2216]: NETFILTER_CFG table=nat:43 family=10 entries=2 op=nft_register_chain pid=2216 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 09:54:53.147000 audit[2216]: SYSCALL arch=c000003e syscall=46 success=yes exit=124 a0=3 a1=7fff2696c730 a2=0 a3=7fff2696c71c items=0 ppid=2161 pid=2216 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:53.147000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4D41524B2D44524F50002D74006E6174 Feb 13 09:54:53.147000 audit[2217]: NETFILTER_CFG table=nat:44 family=2 entries=1 op=nft_register_chain pid=2217 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 09:54:53.147000 audit[2217]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffccab33680 a2=0 a3=7ffccab3366c items=0 ppid=2161 pid=2217 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 
fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:53.147000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Feb 13 09:54:53.147000 audit[2219]: NETFILTER_CFG table=filter:45 family=2 entries=1 op=nft_register_chain pid=2219 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 09:54:53.147000 audit[2219]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffef36b8850 a2=0 a3=7ffef36b883c items=0 ppid=2161 pid=2219 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:53.147000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Feb 13 09:54:53.148000 audit[2220]: NETFILTER_CFG table=nat:46 family=10 entries=1 op=nft_register_rule pid=2220 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 09:54:53.148000 audit[2220]: SYSCALL arch=c000003e syscall=46 success=yes exit=216 a0=3 a1=7fffcd345db0 a2=0 a3=7fffcd345d9c items=0 ppid=2161 pid=2220 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:53.148000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D41004B5542452D4D41524B2D44524F50002D74006E6174002D6A004D41524B002D2D6F722D6D61726B0030783030303038303030 Feb 13 09:54:53.148000 audit[2221]: NETFILTER_CFG table=filter:47 family=10 entries=2 op=nft_register_chain pid=2221 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 09:54:53.148000 audit[2221]: SYSCALL arch=c000003e syscall=46 success=yes exit=132 a0=3 a1=7ffc57e90210 a2=0 a3=7ffc57e901fc items=0 ppid=2161 pid=2221 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:53.148000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4649524557414C4C002D740066696C746572 Feb 13 09:54:53.149000 audit[2223]: NETFILTER_CFG table=filter:48 family=10 entries=1 op=nft_register_rule pid=2223 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 09:54:53.149000 audit[2223]: SYSCALL arch=c000003e syscall=46 success=yes exit=664 a0=3 a1=7ffde4579fb0 a2=0 a3=7ffde4579f9c items=0 ppid=2161 pid=2223 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:53.149000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D41004B5542452D4649524557414C4C002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206669726577616C6C20666F722064726F7070696E67206D61726B6564207061636B657473002D6D006D61726B Feb 13 09:54:53.150000 audit[2224]: NETFILTER_CFG table=nat:49 family=10 entries=1 op=nft_register_chain pid=2224 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 09:54:53.150000 audit[2224]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffde48882b0 a2=0 a3=7ffde488829c items=0 ppid=2161 pid=2224 auid=4294967295 uid=0 gid=0 
euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:53.150000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4D41524B2D4D415351002D74006E6174 Feb 13 09:54:53.150000 audit[2225]: NETFILTER_CFG table=nat:50 family=10 entries=1 op=nft_register_chain pid=2225 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 09:54:53.150000 audit[2225]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffca8b354c0 a2=0 a3=7ffca8b354ac items=0 ppid=2161 pid=2225 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:53.150000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Feb 13 09:54:53.151000 audit[2227]: NETFILTER_CFG table=nat:51 family=10 entries=1 op=nft_register_rule pid=2227 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 09:54:53.151000 audit[2227]: SYSCALL arch=c000003e syscall=46 success=yes exit=216 a0=3 a1=7ffd5af22190 a2=0 a3=7ffd5af2217c items=0 ppid=2161 pid=2227 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:53.151000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D41004B5542452D4D41524B2D4D415351002D74006E6174002D6A004D41524B002D2D6F722D6D61726B0030783030303034303030 Feb 13 09:54:53.153000 audit[2229]: NETFILTER_CFG table=nat:52 family=10 entries=2 op=nft_register_chain pid=2229 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 09:54:53.153000 audit[2229]: SYSCALL arch=c000003e syscall=46 success=yes exit=612 a0=3 a1=7ffdb270e3e0 a2=0 a3=7ffdb270e3cc items=0 ppid=2161 pid=2229 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:53.153000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Feb 13 09:54:53.154000 audit[2231]: NETFILTER_CFG table=nat:53 family=10 entries=1 op=nft_register_rule pid=2231 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 09:54:53.154000 audit[2231]: SYSCALL arch=c000003e syscall=46 success=yes exit=364 a0=3 a1=7ffd81b29a70 a2=0 a3=7ffd81b29a5c items=0 ppid=2161 pid=2231 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:53.154000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D41004B5542452D504F5354524F5554494E47002D74006E6174002D6D006D61726B0000002D2D6D61726B00307830303030343030302F30783030303034303030002D6A0052455455524E Feb 13 09:54:53.155000 audit[2233]: NETFILTER_CFG table=nat:54 family=10 entries=1 op=nft_register_rule pid=2233 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 09:54:53.155000 audit[2233]: SYSCALL arch=c000003e syscall=46 success=yes exit=220 a0=3 
a1=7ffc9a485160 a2=0 a3=7ffc9a48514c items=0 ppid=2161 pid=2233 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:53.155000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D41004B5542452D504F5354524F5554494E47002D74006E6174002D6A004D41524B002D2D786F722D6D61726B0030783030303034303030 Feb 13 09:54:53.156000 audit[2235]: NETFILTER_CFG table=nat:55 family=10 entries=1 op=nft_register_rule pid=2235 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 09:54:53.156000 audit[2235]: SYSCALL arch=c000003e syscall=46 success=yes exit=556 a0=3 a1=7fffbf604200 a2=0 a3=7fffbf6041ec items=0 ppid=2161 pid=2235 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:53.156000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D41004B5542452D504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732073657276696365207472616666696320726571756972696E6720534E4154002D6A004D415351554552414445 Feb 13 09:54:53.157511 kubelet[2161]: I0213 09:54:53.157491 2161 kubelet_network_linux.go:63] "Initialized iptables rules." protocol=IPv6 Feb 13 09:54:53.157511 kubelet[2161]: I0213 09:54:53.157502 2161 status_manager.go:176] "Starting to sync pod status with apiserver" Feb 13 09:54:53.157556 kubelet[2161]: I0213 09:54:53.157513 2161 kubelet.go:2113] "Starting kubelet main sync loop" Feb 13 09:54:53.157556 kubelet[2161]: E0213 09:54:53.157539 2161 kubelet.go:2137] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Feb 13 09:54:53.157850 kubelet[2161]: W0213 09:54:53.157791 2161 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.RuntimeClass: Get "https://139.178.70.43:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.70.43:6443: connect: connection refused Feb 13 09:54:53.157850 kubelet[2161]: E0213 09:54:53.157822 2161 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://139.178.70.43:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.70.43:6443: connect: connection refused Feb 13 09:54:53.157000 audit[2236]: NETFILTER_CFG table=mangle:56 family=10 entries=1 op=nft_register_chain pid=2236 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 09:54:53.157000 audit[2236]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffcc4194800 a2=0 a3=7ffcc41947ec items=0 ppid=2161 pid=2236 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:53.157000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Feb 13 09:54:53.158000 audit[2237]: NETFILTER_CFG table=nat:57 family=10 entries=1 op=nft_register_chain pid=2237 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 09:54:53.158000 audit[2237]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffd5de32020 
a2=0 a3=7ffd5de3200c items=0 ppid=2161 pid=2237 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:53.158000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Feb 13 09:54:53.158000 audit[2238]: NETFILTER_CFG table=filter:58 family=10 entries=1 op=nft_register_chain pid=2238 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 09:54:53.158000 audit[2238]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffeaf778070 a2=0 a3=7ffeaf77805c items=0 ppid=2161 pid=2238 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:53.158000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Feb 13 09:54:53.176869 kubelet[2161]: I0213 09:54:53.176826 2161 cpu_manager.go:214] "Starting CPU manager" policy="none" Feb 13 09:54:53.176869 kubelet[2161]: I0213 09:54:53.176837 2161 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Feb 13 09:54:53.176869 kubelet[2161]: I0213 09:54:53.176848 2161 state_mem.go:36] "Initialized new in-memory state store" Feb 13 09:54:53.177663 kubelet[2161]: I0213 09:54:53.177622 2161 policy_none.go:49] "None policy: Start" Feb 13 09:54:53.178033 kubelet[2161]: I0213 09:54:53.178020 2161 memory_manager.go:169] "Starting memorymanager" policy="None" Feb 13 09:54:53.178097 kubelet[2161]: I0213 09:54:53.178037 2161 state_mem.go:35] "Initializing new in-memory state store" Feb 13 09:54:53.181081 systemd[1]: Created slice kubepods.slice. Feb 13 09:54:53.183959 systemd[1]: Created slice kubepods-burstable.slice. Feb 13 09:54:53.186153 systemd[1]: Created slice kubepods-besteffort.slice. Feb 13 09:54:53.207692 kubelet[2161]: I0213 09:54:53.207649 2161 manager.go:455] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Feb 13 09:54:53.207000 audit[2161]: AVC avc: denied { mac_admin } for pid=2161 comm="kubelet" capability=33 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:53.207000 audit: SELINUX_ERR op=setxattr invalid_context="system_u:object_r:container_file_t:s0" Feb 13 09:54:53.207000 audit[2161]: SYSCALL arch=c000003e syscall=188 success=no exit=-22 a0=c000edc510 a1=c0011af1d0 a2=c000edc4e0 a3=25 items=0 ppid=1 pid=2161 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kubelet" exe="/opt/bin/kubelet" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:53.207000 audit: PROCTITLE proctitle=2F6F70742F62696E2F6B7562656C6574002D2D626F6F7473747261702D6B756265636F6E6669673D2F6574632F6B756265726E657465732F626F6F7473747261702D6B7562656C65742E636F6E66002D2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F6B7562656C65742E636F6E66002D2D636F6E6669 Feb 13 09:54:53.208359 kubelet[2161]: I0213 09:54:53.207738 2161 server.go:88] "Unprivileged containerized plugins might not work. 
Could not set selinux context on socket dir" path="/var/lib/kubelet/device-plugins/" err="setxattr /var/lib/kubelet/device-plugins/: invalid argument" Feb 13 09:54:53.208359 kubelet[2161]: I0213 09:54:53.208017 2161 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Feb 13 09:54:53.208570 kubelet[2161]: E0213 09:54:53.208547 2161 eviction_manager.go:261] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-3510.3.2-a-e401d5bc82\" not found" Feb 13 09:54:53.224807 kubelet[2161]: I0213 09:54:53.224772 2161 kubelet_node_status.go:70] "Attempting to register node" node="ci-3510.3.2-a-e401d5bc82" Feb 13 09:54:53.225408 kubelet[2161]: E0213 09:54:53.225348 2161 kubelet_node_status.go:92] "Unable to register node with API server" err="Post \"https://139.178.70.43:6443/api/v1/nodes\": dial tcp 139.178.70.43:6443: connect: connection refused" node="ci-3510.3.2-a-e401d5bc82" Feb 13 09:54:53.257901 kubelet[2161]: I0213 09:54:53.257803 2161 topology_manager.go:210] "Topology Admit Handler" Feb 13 09:54:53.261155 kubelet[2161]: I0213 09:54:53.261112 2161 topology_manager.go:210] "Topology Admit Handler" Feb 13 09:54:53.264419 kubelet[2161]: I0213 09:54:53.264373 2161 topology_manager.go:210] "Topology Admit Handler" Feb 13 09:54:53.264932 kubelet[2161]: I0213 09:54:53.264894 2161 status_manager.go:698] "Failed to get status for pod" podUID=c9958bab1d18b2706d971dd68861d0c2 pod="kube-system/kube-apiserver-ci-3510.3.2-a-e401d5bc82" err="Get \"https://139.178.70.43:6443/api/v1/namespaces/kube-system/pods/kube-apiserver-ci-3510.3.2-a-e401d5bc82\": dial tcp 139.178.70.43:6443: connect: connection refused" Feb 13 09:54:53.267794 kubelet[2161]: I0213 09:54:53.267750 2161 status_manager.go:698] "Failed to get status for pod" podUID=5f44336ac0c20008fecb50c2ba85b6c8 pod="kube-system/kube-controller-manager-ci-3510.3.2-a-e401d5bc82" err="Get \"https://139.178.70.43:6443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ci-3510.3.2-a-e401d5bc82\": dial tcp 139.178.70.43:6443: connect: connection refused" Feb 13 09:54:53.270773 kubelet[2161]: I0213 09:54:53.270726 2161 status_manager.go:698] "Failed to get status for pod" podUID=fa6515c6a6ba63367f92fd7390859950 pod="kube-system/kube-scheduler-ci-3510.3.2-a-e401d5bc82" err="Get \"https://139.178.70.43:6443/api/v1/namespaces/kube-system/pods/kube-scheduler-ci-3510.3.2-a-e401d5bc82\": dial tcp 139.178.70.43:6443: connect: connection refused" Feb 13 09:54:53.275112 systemd[1]: Created slice kubepods-burstable-podc9958bab1d18b2706d971dd68861d0c2.slice. Feb 13 09:54:53.301414 systemd[1]: Created slice kubepods-burstable-pod5f44336ac0c20008fecb50c2ba85b6c8.slice. Feb 13 09:54:53.318735 systemd[1]: Created slice kubepods-burstable-podfa6515c6a6ba63367f92fd7390859950.slice. 
Feb 13 09:54:53.323927 kubelet[2161]: E0213 09:54:53.322969 2161 controller.go:146] failed to ensure lease exists, will retry in 400ms, error: Get "https://139.178.70.43:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-3510.3.2-a-e401d5bc82?timeout=10s": dial tcp 139.178.70.43:6443: connect: connection refused Feb 13 09:54:53.423451 kubelet[2161]: I0213 09:54:53.423334 2161 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/c9958bab1d18b2706d971dd68861d0c2-ca-certs\") pod \"kube-apiserver-ci-3510.3.2-a-e401d5bc82\" (UID: \"c9958bab1d18b2706d971dd68861d0c2\") " pod="kube-system/kube-apiserver-ci-3510.3.2-a-e401d5bc82" Feb 13 09:54:53.423451 kubelet[2161]: I0213 09:54:53.423445 2161 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/c9958bab1d18b2706d971dd68861d0c2-k8s-certs\") pod \"kube-apiserver-ci-3510.3.2-a-e401d5bc82\" (UID: \"c9958bab1d18b2706d971dd68861d0c2\") " pod="kube-system/kube-apiserver-ci-3510.3.2-a-e401d5bc82" Feb 13 09:54:53.423796 kubelet[2161]: I0213 09:54:53.423600 2161 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/5f44336ac0c20008fecb50c2ba85b6c8-kubeconfig\") pod \"kube-controller-manager-ci-3510.3.2-a-e401d5bc82\" (UID: \"5f44336ac0c20008fecb50c2ba85b6c8\") " pod="kube-system/kube-controller-manager-ci-3510.3.2-a-e401d5bc82" Feb 13 09:54:53.423796 kubelet[2161]: I0213 09:54:53.423763 2161 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/5f44336ac0c20008fecb50c2ba85b6c8-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-3510.3.2-a-e401d5bc82\" (UID: \"5f44336ac0c20008fecb50c2ba85b6c8\") " pod="kube-system/kube-controller-manager-ci-3510.3.2-a-e401d5bc82" Feb 13 09:54:53.424017 kubelet[2161]: I0213 09:54:53.423885 2161 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/fa6515c6a6ba63367f92fd7390859950-kubeconfig\") pod \"kube-scheduler-ci-3510.3.2-a-e401d5bc82\" (UID: \"fa6515c6a6ba63367f92fd7390859950\") " pod="kube-system/kube-scheduler-ci-3510.3.2-a-e401d5bc82" Feb 13 09:54:53.424017 kubelet[2161]: I0213 09:54:53.423978 2161 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/c9958bab1d18b2706d971dd68861d0c2-usr-share-ca-certificates\") pod \"kube-apiserver-ci-3510.3.2-a-e401d5bc82\" (UID: \"c9958bab1d18b2706d971dd68861d0c2\") " pod="kube-system/kube-apiserver-ci-3510.3.2-a-e401d5bc82" Feb 13 09:54:53.424179 kubelet[2161]: I0213 09:54:53.424126 2161 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/5f44336ac0c20008fecb50c2ba85b6c8-ca-certs\") pod \"kube-controller-manager-ci-3510.3.2-a-e401d5bc82\" (UID: \"5f44336ac0c20008fecb50c2ba85b6c8\") " pod="kube-system/kube-controller-manager-ci-3510.3.2-a-e401d5bc82" Feb 13 09:54:53.424261 kubelet[2161]: I0213 09:54:53.424206 2161 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: 
\"kubernetes.io/host-path/5f44336ac0c20008fecb50c2ba85b6c8-flexvolume-dir\") pod \"kube-controller-manager-ci-3510.3.2-a-e401d5bc82\" (UID: \"5f44336ac0c20008fecb50c2ba85b6c8\") " pod="kube-system/kube-controller-manager-ci-3510.3.2-a-e401d5bc82" Feb 13 09:54:53.424388 kubelet[2161]: I0213 09:54:53.424266 2161 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/5f44336ac0c20008fecb50c2ba85b6c8-k8s-certs\") pod \"kube-controller-manager-ci-3510.3.2-a-e401d5bc82\" (UID: \"5f44336ac0c20008fecb50c2ba85b6c8\") " pod="kube-system/kube-controller-manager-ci-3510.3.2-a-e401d5bc82" Feb 13 09:54:53.429830 kubelet[2161]: I0213 09:54:53.429750 2161 kubelet_node_status.go:70] "Attempting to register node" node="ci-3510.3.2-a-e401d5bc82" Feb 13 09:54:53.430423 kubelet[2161]: E0213 09:54:53.430377 2161 kubelet_node_status.go:92] "Unable to register node with API server" err="Post \"https://139.178.70.43:6443/api/v1/nodes\": dial tcp 139.178.70.43:6443: connect: connection refused" node="ci-3510.3.2-a-e401d5bc82" Feb 13 09:54:53.598027 env[1473]: time="2024-02-13T09:54:53.597903801Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-3510.3.2-a-e401d5bc82,Uid:c9958bab1d18b2706d971dd68861d0c2,Namespace:kube-system,Attempt:0,}" Feb 13 09:54:53.615840 env[1473]: time="2024-02-13T09:54:53.615713586Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-3510.3.2-a-e401d5bc82,Uid:5f44336ac0c20008fecb50c2ba85b6c8,Namespace:kube-system,Attempt:0,}" Feb 13 09:54:53.625767 env[1473]: time="2024-02-13T09:54:53.625657234Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-3510.3.2-a-e401d5bc82,Uid:fa6515c6a6ba63367f92fd7390859950,Namespace:kube-system,Attempt:0,}" Feb 13 09:54:53.724304 kubelet[2161]: E0213 09:54:53.724060 2161 controller.go:146] failed to ensure lease exists, will retry in 800ms, error: Get "https://139.178.70.43:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-3510.3.2-a-e401d5bc82?timeout=10s": dial tcp 139.178.70.43:6443: connect: connection refused Feb 13 09:54:53.834607 kubelet[2161]: I0213 09:54:53.834530 2161 kubelet_node_status.go:70] "Attempting to register node" node="ci-3510.3.2-a-e401d5bc82" Feb 13 09:54:53.835105 kubelet[2161]: E0213 09:54:53.835038 2161 kubelet_node_status.go:92] "Unable to register node with API server" err="Post \"https://139.178.70.43:6443/api/v1/nodes\": dial tcp 139.178.70.43:6443: connect: connection refused" node="ci-3510.3.2-a-e401d5bc82" Feb 13 09:54:54.044779 kubelet[2161]: W0213 09:54:54.044551 2161 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.CSIDriver: Get "https://139.178.70.43:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.70.43:6443: connect: connection refused Feb 13 09:54:54.044779 kubelet[2161]: E0213 09:54:54.044669 2161 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://139.178.70.43:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.70.43:6443: connect: connection refused Feb 13 09:54:54.094573 kubelet[2161]: W0213 09:54:54.094429 2161 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.RuntimeClass: Get "https://139.178.70.43:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 
139.178.70.43:6443: connect: connection refused Feb 13 09:54:54.094573 kubelet[2161]: E0213 09:54:54.094551 2161 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://139.178.70.43:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.70.43:6443: connect: connection refused Feb 13 09:54:54.122286 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1508274721.mount: Deactivated successfully. Feb 13 09:54:54.123730 env[1473]: time="2024-02-13T09:54:54.123687649Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/pause:3.6,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 09:54:54.124940 env[1473]: time="2024-02-13T09:54:54.124897796Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:6270bb605e12e581514ada5fd5b3216f727db55dc87d5889c790e4c760683fee,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 09:54:54.125878 env[1473]: time="2024-02-13T09:54:54.125838667Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/pause:3.6,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 09:54:54.126292 env[1473]: time="2024-02-13T09:54:54.126253676Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/pause:3.6,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 09:54:54.126745 env[1473]: time="2024-02-13T09:54:54.126703977Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/pause:3.6,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 09:54:54.127068 env[1473]: time="2024-02-13T09:54:54.127030323Z" level=info msg="ImageUpdate event &ImageUpdate{Name:sha256:6270bb605e12e581514ada5fd5b3216f727db55dc87d5889c790e4c760683fee,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 09:54:54.128333 env[1473]: time="2024-02-13T09:54:54.128290194Z" level=info msg="ImageUpdate event &ImageUpdate{Name:sha256:6270bb605e12e581514ada5fd5b3216f727db55dc87d5889c790e4c760683fee,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 09:54:54.130145 env[1473]: time="2024-02-13T09:54:54.130106175Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/pause@sha256:3d380ca8864549e74af4b29c10f9cb0956236dfb01c40ca076fb6c37253234db,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 09:54:54.130647 env[1473]: time="2024-02-13T09:54:54.130607196Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/pause:3.6,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 09:54:54.131675 env[1473]: time="2024-02-13T09:54:54.131636846Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/pause:3.6,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 09:54:54.132477 env[1473]: time="2024-02-13T09:54:54.132434438Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/pause@sha256:3d380ca8864549e74af4b29c10f9cb0956236dfb01c40ca076fb6c37253234db,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 09:54:54.132854 env[1473]: time="2024-02-13T09:54:54.132814513Z" level=info msg="ImageUpdate event 
&ImageUpdate{Name:registry.k8s.io/pause@sha256:3d380ca8864549e74af4b29c10f9cb0956236dfb01c40ca076fb6c37253234db,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 09:54:54.137582 env[1473]: time="2024-02-13T09:54:54.137488568Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 09:54:54.137582 env[1473]: time="2024-02-13T09:54:54.137510522Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 09:54:54.137582 env[1473]: time="2024-02-13T09:54:54.137532149Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 09:54:54.137722 env[1473]: time="2024-02-13T09:54:54.137624363Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/01b1b47b208e2989f1a91ddf666ed51700a41c75da35c650726ce21930af430d pid=2247 runtime=io.containerd.runc.v2 Feb 13 09:54:54.140177 env[1473]: time="2024-02-13T09:54:54.140124083Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 09:54:54.140177 env[1473]: time="2024-02-13T09:54:54.140149438Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 09:54:54.140177 env[1473]: time="2024-02-13T09:54:54.140156878Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 09:54:54.140382 env[1473]: time="2024-02-13T09:54:54.140238967Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/06716c900aee020f38a2486e0e6a0b936adc20eb917375c87006083c415dbbb5 pid=2266 runtime=io.containerd.runc.v2 Feb 13 09:54:54.141220 env[1473]: time="2024-02-13T09:54:54.141188484Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 09:54:54.141220 env[1473]: time="2024-02-13T09:54:54.141203290Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 09:54:54.141220 env[1473]: time="2024-02-13T09:54:54.141215088Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 09:54:54.141328 env[1473]: time="2024-02-13T09:54:54.141287039Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/c7f748b3fac0d29882f69f57ef69a1da540be603d28e1388a7c38de7f1057149 pid=2282 runtime=io.containerd.runc.v2 Feb 13 09:54:54.144192 systemd[1]: Started cri-containerd-01b1b47b208e2989f1a91ddf666ed51700a41c75da35c650726ce21930af430d.scope. Feb 13 09:54:54.147845 systemd[1]: Started cri-containerd-06716c900aee020f38a2486e0e6a0b936adc20eb917375c87006083c415dbbb5.scope. Feb 13 09:54:54.148817 systemd[1]: Started cri-containerd-c7f748b3fac0d29882f69f57ef69a1da540be603d28e1388a7c38de7f1057149.scope. 
Feb 13 09:54:54.153000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.180545 kernel: kauditd_printk_skb: 280 callbacks suppressed Feb 13 09:54:54.180626 kernel: audit: type=1400 audit(1707818094.153:603): avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.153000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.244342 kernel: audit: type=1400 audit(1707818094.153:604): avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.153000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.369463 kernel: audit: type=1400 audit(1707818094.153:605): avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.369500 kernel: audit: type=1400 audit(1707818094.153:606): avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.153000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.380782 kubelet[2161]: W0213 09:54:54.380730 2161 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.Service: Get "https://139.178.70.43:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 139.178.70.43:6443: connect: connection refused Feb 13 09:54:54.380782 kubelet[2161]: E0213 09:54:54.380761 2161 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://139.178.70.43:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 139.178.70.43:6443: connect: connection refused Feb 13 09:54:54.414032 kubelet[2161]: W0213 09:54:54.413984 2161 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.Node: Get "https://139.178.70.43:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-3510.3.2-a-e401d5bc82&limit=500&resourceVersion=0": dial tcp 139.178.70.43:6443: connect: connection refused Feb 13 09:54:54.414032 kubelet[2161]: E0213 09:54:54.414008 2161 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://139.178.70.43:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-3510.3.2-a-e401d5bc82&limit=500&resourceVersion=0": dial tcp 139.178.70.43:6443: connect: connection refused Feb 13 09:54:54.433128 kernel: audit: audit_backlog=65 > audit_backlog_limit=64 Feb 13 09:54:54.433162 kernel: audit: type=1400 audit(1707818094.153:607): avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 
13 09:54:54.153000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.460029 kernel: audit: audit_lost=1 audit_rate_limit=0 audit_backlog_limit=64 Feb 13 09:54:54.460065 kernel: audit: backlog limit exceeded Feb 13 09:54:54.460083 kernel: audit: audit_backlog=65 > audit_backlog_limit=64 Feb 13 09:54:54.460096 kernel: audit: audit_lost=2 audit_rate_limit=0 audit_backlog_limit=64 Feb 13 09:54:54.524700 kubelet[2161]: E0213 09:54:54.524662 2161 controller.go:146] failed to ensure lease exists, will retry in 1.6s, error: Get "https://139.178.70.43:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-3510.3.2-a-e401d5bc82?timeout=10s": dial tcp 139.178.70.43:6443: connect: connection refused Feb 13 09:54:54.153000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.153000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.153000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.153000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.243000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.243000 audit: BPF prog-id=67 op=LOAD Feb 13 09:54:54.243000 audit[2261]: AVC avc: denied { bpf } for pid=2261 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.243000 audit[2261]: SYSCALL arch=c000003e syscall=321 success=yes exit=0 a0=f a1=c000197c48 a2=10 a3=1c items=0 ppid=2247 pid=2261 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:54.243000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3031623162343762323038653239383966316139316464663636366564 Feb 13 09:54:54.243000 audit[2261]: AVC avc: denied { perfmon } for pid=2261 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.243000 audit[2261]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=0 a1=c0001976b0 a2=3c a3=8 items=0 ppid=2247 pid=2261 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:54.243000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3031623162343762323038653239383966316139316464663636366564 Feb 13 09:54:54.243000 audit[2261]: AVC avc: denied { bpf } for pid=2261 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.243000 audit[2261]: AVC avc: denied { bpf } for pid=2261 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.243000 audit[2261]: AVC avc: denied { bpf } for pid=2261 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.243000 audit[2261]: AVC avc: denied { perfmon } for pid=2261 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.243000 audit[2261]: AVC avc: denied { perfmon } for pid=2261 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.243000 audit[2261]: AVC avc: denied { perfmon } for pid=2261 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.243000 audit[2261]: AVC avc: denied { perfmon } for pid=2261 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.243000 audit[2261]: AVC avc: denied { perfmon } for pid=2261 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.243000 audit[2261]: AVC avc: denied { bpf } for pid=2261 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.243000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.243000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.243000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.243000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.243000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.243000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.243000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 
tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.243000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.243000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.243000 audit[2261]: AVC avc: denied { bpf } for pid=2261 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.243000 audit: BPF prog-id=68 op=LOAD Feb 13 09:54:54.243000 audit[2261]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=5 a1=c0001979d8 a2=78 a3=c0002217f0 items=0 ppid=2247 pid=2261 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:54.243000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3031623162343762323038653239383966316139316464663636366564 Feb 13 09:54:54.369000 audit[2261]: AVC avc: denied { bpf } for pid=2261 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.369000 audit[2261]: AVC avc: denied { bpf } for pid=2261 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.369000 audit[2261]: AVC avc: denied { perfmon } for pid=2261 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.369000 audit[2261]: AVC avc: denied { perfmon } for pid=2261 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.369000 audit[2261]: AVC avc: denied { perfmon } for pid=2261 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.369000 audit[2261]: AVC avc: denied { perfmon } for pid=2261 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.369000 audit[2261]: AVC avc: denied { perfmon } for pid=2261 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.369000 audit[2261]: AVC avc: denied { bpf } for pid=2261 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.369000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.369000 audit: BPF prog-id=69 op=LOAD Feb 13 09:54:54.369000 audit[2291]: AVC avc: denied { bpf } for pid=2291 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 
permissive=0 Feb 13 09:54:54.369000 audit[2291]: SYSCALL arch=c000003e syscall=321 success=yes exit=0 a0=f a1=c000145c48 a2=10 a3=1c items=0 ppid=2266 pid=2291 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:54.369000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3036373136633930306165653032306633386132343836653065366130 Feb 13 09:54:54.369000 audit[2291]: AVC avc: denied { perfmon } for pid=2291 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.369000 audit[2291]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=0 a1=c0001456b0 a2=3c a3=c items=0 ppid=2266 pid=2291 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:54.369000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3036373136633930306165653032306633386132343836653065366130 Feb 13 09:54:54.369000 audit[2291]: AVC avc: denied { bpf } for pid=2291 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.369000 audit[2291]: AVC avc: denied { bpf } for pid=2291 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.369000 audit[2291]: AVC avc: denied { bpf } for pid=2291 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.369000 audit[2291]: AVC avc: denied { perfmon } for pid=2291 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.369000 audit[2291]: AVC avc: denied { perfmon } for pid=2291 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.369000 audit[2291]: AVC avc: denied { perfmon } for pid=2291 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.369000 audit[2291]: AVC avc: denied { perfmon } for pid=2291 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.369000 audit[2291]: AVC avc: denied { perfmon } for pid=2291 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.369000 audit[2291]: AVC avc: denied { bpf } for pid=2291 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.459000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 
scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.575000 audit[2298]: AVC avc: denied { bpf } for pid=2298 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.575000 audit: BPF prog-id=73 op=LOAD Feb 13 09:54:54.369000 audit[2261]: SYSCALL arch=c000003e syscall=321 success=yes exit=18 a0=5 a1=c000197770 a2=78 a3=c000221838 items=0 ppid=2247 pid=2261 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:54.369000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3031623162343762323038653239383966316139316464663636366564 Feb 13 09:54:54.369000 audit[2291]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=5 a1=c0001459d8 a2=78 a3=c000261ce0 items=0 ppid=2266 pid=2291 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:54.575000 audit[2298]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=5 a1=c0001979d8 a2=78 a3=c00021eeb0 items=0 ppid=2282 pid=2298 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:54.369000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3036373136633930306165653032306633386132343836653065366130 Feb 13 09:54:54.575000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6337663734386233666163306432393838326636396635376566363961 Feb 13 09:54:54.629000 audit: BPF prog-id=70 op=UNLOAD Feb 13 09:54:54.629000 audit[2298]: AVC avc: denied { bpf } for pid=2298 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.629000 audit[2291]: AVC avc: denied { bpf } for pid=2291 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.629000 audit[2298]: AVC avc: denied { bpf } for pid=2298 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.629000 audit[2298]: AVC avc: denied { perfmon } for pid=2298 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.629000 audit: BPF prog-id=68 op=UNLOAD Feb 13 09:54:54.629000 audit[2298]: AVC avc: denied { perfmon } for pid=2298 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.629000 audit[2298]: AVC 
avc: denied { perfmon } for pid=2298 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.629000 audit[2298]: AVC avc: denied { perfmon } for pid=2298 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.629000 audit[2291]: AVC avc: denied { bpf } for pid=2291 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.629000 audit[2298]: AVC avc: denied { perfmon } for pid=2298 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.629000 audit[2291]: AVC avc: denied { perfmon } for pid=2291 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.629000 audit[2298]: AVC avc: denied { bpf } for pid=2298 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.629000 audit[2291]: AVC avc: denied { perfmon } for pid=2291 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.629000 audit[2291]: AVC avc: denied { perfmon } for pid=2291 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.629000 audit[2291]: AVC avc: denied { perfmon } for pid=2291 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.629000 audit[2291]: AVC avc: denied { perfmon } for pid=2291 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.629000 audit[2291]: AVC avc: denied { bpf } for pid=2291 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.629000 audit[2298]: SYSCALL arch=c000003e syscall=321 success=no exit=-11 a0=5 a1=c000197770 a2=78 a3=c00021eef8 items=0 ppid=2282 pid=2298 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:54.629000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6337663734386233666163306432393838326636396635376566363961 Feb 13 09:54:54.629000 audit[2261]: AVC avc: denied { bpf } for pid=2261 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.629000 audit[2291]: AVC avc: denied { bpf } for pid=2291 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.629000 audit[2298]: AVC avc: denied { bpf } for pid=2298 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 
tclass=capability2 permissive=0 Feb 13 09:54:54.629000 audit: BPF prog-id=74 op=LOAD Feb 13 09:54:54.629000 audit[2261]: AVC avc: denied { bpf } for pid=2261 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.629000 audit[2291]: SYSCALL arch=c000003e syscall=321 success=yes exit=18 a0=5 a1=c000145770 a2=78 a3=c000261d28 items=0 ppid=2266 pid=2291 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:54.629000 audit[2298]: AVC avc: denied { bpf } for pid=2298 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.629000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3036373136633930306165653032306633386132343836653065366130 Feb 13 09:54:54.629000 audit[2298]: AVC avc: denied { perfmon } for pid=2298 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.629000 audit[2298]: AVC avc: denied { perfmon } for pid=2298 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.629000 audit[2298]: AVC avc: denied { perfmon } for pid=2298 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.629000 audit: BPF prog-id=74 op=UNLOAD Feb 13 09:54:54.629000 audit[2298]: AVC avc: denied { perfmon } for pid=2298 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.629000 audit[2298]: AVC avc: denied { perfmon } for pid=2298 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.629000 audit[2261]: AVC avc: denied { bpf } for pid=2261 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.629000 audit[2298]: AVC avc: denied { bpf } for pid=2298 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.629000 audit[2261]: AVC avc: denied { perfmon } for pid=2261 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.629000 audit[2261]: AVC avc: denied { perfmon } for pid=2261 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.629000 audit[2261]: AVC avc: denied { perfmon } for pid=2261 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.629000 audit[2261]: AVC avc: denied { perfmon } for pid=2261 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 
Feb 13 09:54:54.629000 audit[2261]: AVC avc: denied { perfmon } for pid=2261 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.629000 audit[2261]: AVC avc: denied { bpf } for pid=2261 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.629000 audit: BPF prog-id=71 op=UNLOAD Feb 13 09:54:54.629000 audit[2298]: AVC avc: denied { bpf } for pid=2298 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.629000 audit: BPF prog-id=75 op=LOAD Feb 13 09:54:54.629000 audit[2298]: SYSCALL arch=c000003e syscall=321 success=yes exit=18 a0=5 a1=c000197770 a2=78 a3=1 items=0 ppid=2282 pid=2298 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:54.629000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6337663734386233666163306432393838326636396635376566363961 Feb 13 09:54:54.629000 audit: BPF prog-id=75 op=UNLOAD Feb 13 09:54:54.629000 audit: BPF prog-id=73 op=UNLOAD Feb 13 09:54:54.629000 audit[2291]: AVC avc: denied { bpf } for pid=2291 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.629000 audit[2291]: AVC avc: denied { bpf } for pid=2291 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.629000 audit[2291]: AVC avc: denied { bpf } for pid=2291 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.629000 audit[2291]: AVC avc: denied { perfmon } for pid=2291 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.629000 audit[2291]: AVC avc: denied { perfmon } for pid=2291 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.629000 audit[2298]: AVC avc: denied { bpf } for pid=2298 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.629000 audit[2291]: AVC avc: denied { perfmon } for pid=2291 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.629000 audit[2298]: AVC avc: denied { bpf } for pid=2298 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.629000 audit[2291]: AVC avc: denied { perfmon } for pid=2291 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.629000 audit[2291]: AVC avc: denied { perfmon } for pid=2291 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 
tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.629000 audit[2291]: AVC avc: denied { bpf } for pid=2291 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.629000 audit[2298]: AVC avc: denied { bpf } for pid=2298 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.629000 audit[2298]: AVC avc: denied { perfmon } for pid=2298 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.629000 audit[2298]: AVC avc: denied { perfmon } for pid=2298 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.629000 audit[2298]: AVC avc: denied { perfmon } for pid=2298 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.629000 audit[2298]: AVC avc: denied { perfmon } for pid=2298 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.629000 audit[2298]: AVC avc: denied { perfmon } for pid=2298 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.629000 audit[2298]: AVC avc: denied { bpf } for pid=2298 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.629000 audit[2261]: AVC avc: denied { bpf } for pid=2261 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.629000 audit: BPF prog-id=76 op=LOAD Feb 13 09:54:54.629000 audit[2261]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=5 a1=c000197c30 a2=78 a3=c000221c48 items=0 ppid=2247 pid=2261 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:54.629000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3031623162343762323038653239383966316139316464663636366564 Feb 13 09:54:54.629000 audit[2291]: AVC avc: denied { bpf } for pid=2291 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.629000 audit: BPF prog-id=77 op=LOAD Feb 13 09:54:54.629000 audit[2291]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=5 a1=c000145c30 a2=78 a3=c0003e2138 items=0 ppid=2266 pid=2291 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:54.629000 audit[2298]: AVC avc: denied { bpf } for pid=2298 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.629000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3036373136633930306165653032306633386132343836653065366130 Feb 13 09:54:54.629000 audit: BPF prog-id=78 op=LOAD Feb 13 09:54:54.629000 audit[2298]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=5 a1=c000197c30 a2=78 a3=c00021f308 items=0 ppid=2282 pid=2298 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:54.629000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6337663734386233666163306432393838326636396635376566363961 Feb 13 09:54:54.636476 kubelet[2161]: I0213 09:54:54.636463 2161 kubelet_node_status.go:70] "Attempting to register node" node="ci-3510.3.2-a-e401d5bc82" Feb 13 09:54:54.636660 kubelet[2161]: E0213 09:54:54.636652 2161 kubelet_node_status.go:92] "Unable to register node with API server" err="Post \"https://139.178.70.43:6443/api/v1/nodes\": dial tcp 139.178.70.43:6443: connect: connection refused" node="ci-3510.3.2-a-e401d5bc82" Feb 13 09:54:54.645649 env[1473]: time="2024-02-13T09:54:54.645617679Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-3510.3.2-a-e401d5bc82,Uid:5f44336ac0c20008fecb50c2ba85b6c8,Namespace:kube-system,Attempt:0,} returns sandbox id \"c7f748b3fac0d29882f69f57ef69a1da540be603d28e1388a7c38de7f1057149\"" Feb 13 09:54:54.646000 env[1473]: time="2024-02-13T09:54:54.645844266Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-3510.3.2-a-e401d5bc82,Uid:fa6515c6a6ba63367f92fd7390859950,Namespace:kube-system,Attempt:0,} returns sandbox id \"06716c900aee020f38a2486e0e6a0b936adc20eb917375c87006083c415dbbb5\"" Feb 13 09:54:54.646650 env[1473]: time="2024-02-13T09:54:54.646633606Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-3510.3.2-a-e401d5bc82,Uid:c9958bab1d18b2706d971dd68861d0c2,Namespace:kube-system,Attempt:0,} returns sandbox id \"01b1b47b208e2989f1a91ddf666ed51700a41c75da35c650726ce21930af430d\"" Feb 13 09:54:54.647715 env[1473]: time="2024-02-13T09:54:54.647702559Z" level=info msg="CreateContainer within sandbox \"06716c900aee020f38a2486e0e6a0b936adc20eb917375c87006083c415dbbb5\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Feb 13 09:54:54.647759 env[1473]: time="2024-02-13T09:54:54.647715601Z" level=info msg="CreateContainer within sandbox \"01b1b47b208e2989f1a91ddf666ed51700a41c75da35c650726ce21930af430d\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Feb 13 09:54:54.647782 env[1473]: time="2024-02-13T09:54:54.647774253Z" level=info msg="CreateContainer within sandbox \"c7f748b3fac0d29882f69f57ef69a1da540be603d28e1388a7c38de7f1057149\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Feb 13 09:54:54.655400 env[1473]: time="2024-02-13T09:54:54.655381923Z" level=info msg="CreateContainer within sandbox \"c7f748b3fac0d29882f69f57ef69a1da540be603d28e1388a7c38de7f1057149\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"7221c0aeecf9e6208bb878cc9149862f581668ecfd14e30970318376c215cd96\"" Feb 13 09:54:54.655675 
env[1473]: time="2024-02-13T09:54:54.655629762Z" level=info msg="StartContainer for \"7221c0aeecf9e6208bb878cc9149862f581668ecfd14e30970318376c215cd96\"" Feb 13 09:54:54.657120 env[1473]: time="2024-02-13T09:54:54.657060841Z" level=info msg="CreateContainer within sandbox \"01b1b47b208e2989f1a91ddf666ed51700a41c75da35c650726ce21930af430d\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"901005141a7eb44d34fee2a1e60597632761e846336936944aa3f523e237c9c7\"" Feb 13 09:54:54.657496 env[1473]: time="2024-02-13T09:54:54.657478628Z" level=info msg="CreateContainer within sandbox \"06716c900aee020f38a2486e0e6a0b936adc20eb917375c87006083c415dbbb5\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"bd5a5d3ba942cd7d127789db072d048104772cf5feabdb65e50a2cb1eec7f1fc\"" Feb 13 09:54:54.657542 env[1473]: time="2024-02-13T09:54:54.657491192Z" level=info msg="StartContainer for \"901005141a7eb44d34fee2a1e60597632761e846336936944aa3f523e237c9c7\"" Feb 13 09:54:54.657634 env[1473]: time="2024-02-13T09:54:54.657617066Z" level=info msg="StartContainer for \"bd5a5d3ba942cd7d127789db072d048104772cf5feabdb65e50a2cb1eec7f1fc\"" Feb 13 09:54:54.664884 systemd[1]: Started cri-containerd-7221c0aeecf9e6208bb878cc9149862f581668ecfd14e30970318376c215cd96.scope. Feb 13 09:54:54.667948 systemd[1]: Started cri-containerd-901005141a7eb44d34fee2a1e60597632761e846336936944aa3f523e237c9c7.scope. Feb 13 09:54:54.668979 systemd[1]: Started cri-containerd-bd5a5d3ba942cd7d127789db072d048104772cf5feabdb65e50a2cb1eec7f1fc.scope. Feb 13 09:54:54.674000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.674000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.674000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.674000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.674000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.674000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.674000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.674000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.674000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.674000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 
scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.674000 audit: BPF prog-id=79 op=LOAD Feb 13 09:54:54.674000 audit[2381]: AVC avc: denied { bpf } for pid=2381 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.674000 audit[2381]: SYSCALL arch=c000003e syscall=321 success=yes exit=0 a0=f a1=c000145c48 a2=10 a3=1c items=0 ppid=2282 pid=2381 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:54.674000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3732323163306165656366396536323038626238373863633931343938 Feb 13 09:54:54.674000 audit[2381]: AVC avc: denied { perfmon } for pid=2381 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.674000 audit[2381]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=0 a1=c0001456b0 a2=3c a3=8 items=0 ppid=2282 pid=2381 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:54.674000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3732323163306165656366396536323038626238373863633931343938 Feb 13 09:54:54.674000 audit[2381]: AVC avc: denied { bpf } for pid=2381 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.674000 audit[2381]: AVC avc: denied { bpf } for pid=2381 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.674000 audit[2381]: AVC avc: denied { bpf } for pid=2381 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.674000 audit[2381]: AVC avc: denied { perfmon } for pid=2381 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.674000 audit[2381]: AVC avc: denied { perfmon } for pid=2381 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.674000 audit[2381]: AVC avc: denied { perfmon } for pid=2381 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.674000 audit[2381]: AVC avc: denied { perfmon } for pid=2381 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.674000 audit[2381]: AVC avc: denied { perfmon } for pid=2381 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 
tclass=capability2 permissive=0 Feb 13 09:54:54.674000 audit[2381]: AVC avc: denied { bpf } for pid=2381 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.674000 audit[2381]: AVC avc: denied { bpf } for pid=2381 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.674000 audit: BPF prog-id=80 op=LOAD Feb 13 09:54:54.674000 audit[2381]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=5 a1=c0001459d8 a2=78 a3=c000219530 items=0 ppid=2282 pid=2381 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:54.674000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3732323163306165656366396536323038626238373863633931343938 Feb 13 09:54:54.674000 audit[2381]: AVC avc: denied { bpf } for pid=2381 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.674000 audit[2381]: AVC avc: denied { bpf } for pid=2381 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.674000 audit[2381]: AVC avc: denied { perfmon } for pid=2381 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.674000 audit[2381]: AVC avc: denied { perfmon } for pid=2381 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.674000 audit[2381]: AVC avc: denied { perfmon } for pid=2381 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.674000 audit[2381]: AVC avc: denied { perfmon } for pid=2381 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.674000 audit[2381]: AVC avc: denied { perfmon } for pid=2381 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.674000 audit[2381]: AVC avc: denied { bpf } for pid=2381 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.674000 audit[2381]: AVC avc: denied { bpf } for pid=2381 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.674000 audit: BPF prog-id=81 op=LOAD Feb 13 09:54:54.674000 audit[2381]: SYSCALL arch=c000003e syscall=321 success=yes exit=18 a0=5 a1=c000145770 a2=78 a3=c000219578 items=0 ppid=2282 pid=2381 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:54.674000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3732323163306165656366396536323038626238373863633931343938 Feb 13 09:54:54.674000 audit: BPF prog-id=81 op=UNLOAD Feb 13 09:54:54.674000 audit: BPF prog-id=80 op=UNLOAD Feb 13 09:54:54.674000 audit[2381]: AVC avc: denied { bpf } for pid=2381 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.674000 audit[2381]: AVC avc: denied { bpf } for pid=2381 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.674000 audit[2381]: AVC avc: denied { bpf } for pid=2381 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.674000 audit[2381]: AVC avc: denied { perfmon } for pid=2381 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.674000 audit[2381]: AVC avc: denied { perfmon } for pid=2381 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.674000 audit[2381]: AVC avc: denied { perfmon } for pid=2381 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.674000 audit[2381]: AVC avc: denied { perfmon } for pid=2381 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.674000 audit[2381]: AVC avc: denied { perfmon } for pid=2381 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.674000 audit[2381]: AVC avc: denied { bpf } for pid=2381 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.674000 audit[2381]: AVC avc: denied { bpf } for pid=2381 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.674000 audit: BPF prog-id=82 op=LOAD Feb 13 09:54:54.674000 audit[2381]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=5 a1=c000145c30 a2=78 a3=c000219988 items=0 ppid=2282 pid=2381 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:54.674000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3732323163306165656366396536323038626238373863633931343938 Feb 13 09:54:54.675000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.675000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 
tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.675000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.675000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.675000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.675000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.675000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.675000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.675000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.675000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.675000 audit: BPF prog-id=83 op=LOAD Feb 13 09:54:54.675000 audit[2388]: AVC avc: denied { bpf } for pid=2388 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.675000 audit[2388]: SYSCALL arch=c000003e syscall=321 success=yes exit=0 a0=f a1=c000195c48 a2=10 a3=1c items=0 ppid=2266 pid=2388 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:54.675000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6264356135643362613934326364376431323737383964623037326430 Feb 13 09:54:54.675000 audit[2388]: AVC avc: denied { perfmon } for pid=2388 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.675000 audit[2388]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=0 a1=c0001956b0 a2=3c a3=8 items=0 ppid=2266 pid=2388 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:54.675000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6264356135643362613934326364376431323737383964623037326430 Feb 13 09:54:54.675000 audit[2388]: AVC 
avc: denied { bpf } for pid=2388 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.675000 audit[2388]: AVC avc: denied { bpf } for pid=2388 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.675000 audit[2388]: AVC avc: denied { bpf } for pid=2388 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.675000 audit[2388]: AVC avc: denied { perfmon } for pid=2388 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.675000 audit[2388]: AVC avc: denied { perfmon } for pid=2388 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.675000 audit[2388]: AVC avc: denied { perfmon } for pid=2388 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.675000 audit[2388]: AVC avc: denied { perfmon } for pid=2388 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.675000 audit[2388]: AVC avc: denied { perfmon } for pid=2388 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.675000 audit[2388]: AVC avc: denied { bpf } for pid=2388 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.675000 audit[2388]: AVC avc: denied { bpf } for pid=2388 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.675000 audit: BPF prog-id=84 op=LOAD Feb 13 09:54:54.675000 audit[2388]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=5 a1=c0001959d8 a2=78 a3=c0003009b0 items=0 ppid=2266 pid=2388 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:54.675000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6264356135643362613934326364376431323737383964623037326430 Feb 13 09:54:54.675000 audit[2388]: AVC avc: denied { bpf } for pid=2388 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.675000 audit[2388]: AVC avc: denied { bpf } for pid=2388 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.675000 audit[2388]: AVC avc: denied { perfmon } for pid=2388 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.675000 audit[2388]: AVC avc: denied { perfmon } for pid=2388 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 
tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.675000 audit[2388]: AVC avc: denied { perfmon } for pid=2388 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.675000 audit[2388]: AVC avc: denied { perfmon } for pid=2388 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.675000 audit[2388]: AVC avc: denied { perfmon } for pid=2388 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.675000 audit[2388]: AVC avc: denied { bpf } for pid=2388 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.675000 audit[2388]: AVC avc: denied { bpf } for pid=2388 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.675000 audit: BPF prog-id=85 op=LOAD Feb 13 09:54:54.675000 audit[2388]: SYSCALL arch=c000003e syscall=321 success=yes exit=18 a0=5 a1=c000195770 a2=78 a3=c0003009f8 items=0 ppid=2266 pid=2388 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:54.675000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6264356135643362613934326364376431323737383964623037326430 Feb 13 09:54:54.675000 audit: BPF prog-id=85 op=UNLOAD Feb 13 09:54:54.675000 audit: BPF prog-id=84 op=UNLOAD Feb 13 09:54:54.675000 audit[2388]: AVC avc: denied { bpf } for pid=2388 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.675000 audit[2388]: AVC avc: denied { bpf } for pid=2388 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.675000 audit[2388]: AVC avc: denied { bpf } for pid=2388 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.675000 audit[2388]: AVC avc: denied { perfmon } for pid=2388 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.675000 audit[2388]: AVC avc: denied { perfmon } for pid=2388 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.675000 audit[2388]: AVC avc: denied { perfmon } for pid=2388 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.675000 audit[2388]: AVC avc: denied { perfmon } for pid=2388 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.675000 audit[2388]: AVC avc: denied { perfmon } for pid=2388 comm="runc" capability=38 
scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.675000 audit[2388]: AVC avc: denied { bpf } for pid=2388 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.675000 audit[2388]: AVC avc: denied { bpf } for pid=2388 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.675000 audit: BPF prog-id=86 op=LOAD Feb 13 09:54:54.675000 audit[2388]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=5 a1=c000195c30 a2=78 a3=c000300e08 items=0 ppid=2266 pid=2388 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:54.675000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6264356135643362613934326364376431323737383964623037326430 Feb 13 09:54:54.675000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.675000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.676000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.676000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.676000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.676000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.676000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.676000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.676000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.676000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.676000 audit: BPF prog-id=87 op=LOAD Feb 13 09:54:54.676000 audit[2391]: AVC avc: denied { bpf } for pid=2391 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 
tclass=capability2 permissive=0 Feb 13 09:54:54.676000 audit[2391]: SYSCALL arch=c000003e syscall=321 success=yes exit=0 a0=f a1=c000197c48 a2=10 a3=1c items=0 ppid=2247 pid=2391 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:54.676000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3930313030353134316137656234346433346665653261316536303539 Feb 13 09:54:54.676000 audit[2391]: AVC avc: denied { perfmon } for pid=2391 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.676000 audit[2391]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=0 a1=c0001976b0 a2=3c a3=8 items=0 ppid=2247 pid=2391 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:54.676000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3930313030353134316137656234346433346665653261316536303539 Feb 13 09:54:54.676000 audit[2391]: AVC avc: denied { bpf } for pid=2391 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.676000 audit[2391]: AVC avc: denied { bpf } for pid=2391 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.676000 audit[2391]: AVC avc: denied { bpf } for pid=2391 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.676000 audit[2391]: AVC avc: denied { perfmon } for pid=2391 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.676000 audit[2391]: AVC avc: denied { perfmon } for pid=2391 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.676000 audit[2391]: AVC avc: denied { perfmon } for pid=2391 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.676000 audit[2391]: AVC avc: denied { perfmon } for pid=2391 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.676000 audit[2391]: AVC avc: denied { perfmon } for pid=2391 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.676000 audit[2391]: AVC avc: denied { bpf } for pid=2391 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.676000 audit[2391]: AVC avc: denied { bpf } for pid=2391 comm="runc" capability=39 
scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.676000 audit: BPF prog-id=88 op=LOAD Feb 13 09:54:54.676000 audit[2391]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=5 a1=c0001979d8 a2=78 a3=c000261d10 items=0 ppid=2247 pid=2391 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:54.676000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3930313030353134316137656234346433346665653261316536303539 Feb 13 09:54:54.676000 audit[2391]: AVC avc: denied { bpf } for pid=2391 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.676000 audit[2391]: AVC avc: denied { bpf } for pid=2391 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.676000 audit[2391]: AVC avc: denied { perfmon } for pid=2391 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.676000 audit[2391]: AVC avc: denied { perfmon } for pid=2391 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.676000 audit[2391]: AVC avc: denied { perfmon } for pid=2391 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.676000 audit[2391]: AVC avc: denied { perfmon } for pid=2391 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.676000 audit[2391]: AVC avc: denied { perfmon } for pid=2391 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.676000 audit[2391]: AVC avc: denied { bpf } for pid=2391 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.676000 audit[2391]: AVC avc: denied { bpf } for pid=2391 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.676000 audit: BPF prog-id=89 op=LOAD Feb 13 09:54:54.676000 audit[2391]: SYSCALL arch=c000003e syscall=321 success=yes exit=18 a0=5 a1=c000197770 a2=78 a3=c000261d58 items=0 ppid=2247 pid=2391 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:54.676000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3930313030353134316137656234346433346665653261316536303539 Feb 13 09:54:54.676000 audit: BPF prog-id=89 op=UNLOAD Feb 13 09:54:54.676000 audit: BPF prog-id=88 op=UNLOAD Feb 13 
09:54:54.676000 audit[2391]: AVC avc: denied { bpf } for pid=2391 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.676000 audit[2391]: AVC avc: denied { bpf } for pid=2391 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.676000 audit[2391]: AVC avc: denied { bpf } for pid=2391 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.676000 audit[2391]: AVC avc: denied { perfmon } for pid=2391 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.676000 audit[2391]: AVC avc: denied { perfmon } for pid=2391 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.676000 audit[2391]: AVC avc: denied { perfmon } for pid=2391 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.676000 audit[2391]: AVC avc: denied { perfmon } for pid=2391 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.676000 audit[2391]: AVC avc: denied { perfmon } for pid=2391 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.676000 audit[2391]: AVC avc: denied { bpf } for pid=2391 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.676000 audit[2391]: AVC avc: denied { bpf } for pid=2391 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:54.676000 audit: BPF prog-id=90 op=LOAD Feb 13 09:54:54.676000 audit[2391]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=5 a1=c000197c30 a2=78 a3=c000366168 items=0 ppid=2247 pid=2391 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:54.676000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3930313030353134316137656234346433346665653261316536303539 Feb 13 09:54:54.693050 env[1473]: time="2024-02-13T09:54:54.693013098Z" level=info msg="StartContainer for \"bd5a5d3ba942cd7d127789db072d048104772cf5feabdb65e50a2cb1eec7f1fc\" returns successfully" Feb 13 09:54:54.693203 env[1473]: time="2024-02-13T09:54:54.693107126Z" level=info msg="StartContainer for \"7221c0aeecf9e6208bb878cc9149862f581668ecfd14e30970318376c215cd96\" returns successfully" Feb 13 09:54:54.694463 env[1473]: time="2024-02-13T09:54:54.694448682Z" level=info msg="StartContainer for \"901005141a7eb44d34fee2a1e60597632761e846336936944aa3f523e237c9c7\" returns successfully" Feb 13 09:54:55.202000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" 
ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:54:55.202000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:54:55.202000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=9 a1=c000410540 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 09:54:55.202000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=8 a1=c0010a4000 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 09:54:55.202000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 09:54:55.202000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 09:54:55.434000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:54:55.434000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=43 a1=c004a12000 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 09:54:55.434000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:54:55.434000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=45 a1=c003482220 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 09:54:55.434000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 09:54:55.434000 audit: PROCTITLE 
proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 09:54:55.434000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/apiserver.crt" dev="sdb9" ino=525073 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:54:55.434000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=43 a1=c0049942a0 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 09:54:55.434000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 09:54:55.435000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-client.crt" dev="sdb9" ino=525079 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:54:55.435000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=48 a1=c004a12060 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 09:54:55.435000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 09:54:55.435000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:54:55.435000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=48 a1=c004608f20 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 09:54:55.435000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 09:54:55.435000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:54:55.435000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=52 a1=c004c64900 a2=fc6 a3=0 items=0 ppid=2247 
pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 09:54:55.435000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 09:54:55.499658 kubelet[2161]: E0213 09:54:55.499525 2161 event.go:267] Server rejected event '&v1.Event{TypeMeta:v1.TypeMeta{Kind:"", APIVersion:""}, ObjectMeta:v1.ObjectMeta{Name:"ci-3510.3.2-a-e401d5bc82.17b36381dd711d0a", GenerateName:"", Namespace:"default", SelfLink:"", UID:"", ResourceVersion:"", Generation:0, CreationTimestamp:time.Date(1, time.January, 1, 0, 0, 0, 0, time.UTC), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string(nil), Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, InvolvedObject:v1.ObjectReference{Kind:"Node", Namespace:"", Name:"ci-3510.3.2-a-e401d5bc82", UID:"ci-3510.3.2-a-e401d5bc82", APIVersion:"", ResourceVersion:"", FieldPath:""}, Reason:"Starting", Message:"Starting kubelet.", Source:v1.EventSource{Component:"kubelet", Host:"ci-3510.3.2-a-e401d5bc82"}, FirstTimestamp:time.Date(2024, time.February, 13, 9, 54, 53, 120658698, time.Local), LastTimestamp:time.Date(2024, time.February, 13, 9, 54, 53, 120658698, time.Local), Count:1, Type:"Normal", EventTime:time.Date(1, time.January, 1, 0, 0, 0, 0, time.UTC), Series:(*v1.EventSeries)(nil), Action:"", Related:(*v1.ObjectReference)(nil), ReportingController:"", ReportingInstance:""}': 'namespaces "default" not found' (will not retry!) Feb 13 09:54:55.554006 kubelet[2161]: E0213 09:54:55.553815 2161 event.go:267] Server rejected event '&v1.Event{TypeMeta:v1.TypeMeta{Kind:"", APIVersion:""}, ObjectMeta:v1.ObjectMeta{Name:"ci-3510.3.2-a-e401d5bc82.17b36381dd765ed5", GenerateName:"", Namespace:"default", SelfLink:"", UID:"", ResourceVersion:"", Generation:0, CreationTimestamp:time.Date(1, time.January, 1, 0, 0, 0, 0, time.UTC), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string(nil), Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, InvolvedObject:v1.ObjectReference{Kind:"Node", Namespace:"", Name:"ci-3510.3.2-a-e401d5bc82", UID:"ci-3510.3.2-a-e401d5bc82", APIVersion:"", ResourceVersion:"", FieldPath:""}, Reason:"InvalidDiskCapacity", Message:"invalid capacity 0 on image filesystem", Source:v1.EventSource{Component:"kubelet", Host:"ci-3510.3.2-a-e401d5bc82"}, FirstTimestamp:time.Date(2024, time.February, 13, 9, 54, 53, 121003221, time.Local), LastTimestamp:time.Date(2024, time.February, 13, 9, 54, 53, 121003221, time.Local), Count:1, Type:"Warning", EventTime:time.Date(1, time.January, 1, 0, 0, 0, 0, time.UTC), Series:(*v1.EventSeries)(nil), Action:"", Related:(*v1.ObjectReference)(nil), ReportingController:"", ReportingInstance:""}': 'namespaces "default" not found' (will not retry!)
Feb 13 09:54:55.610156 kubelet[2161]: E0213 09:54:55.609922 2161 event.go:267] Server rejected event '&v1.Event{TypeMeta:v1.TypeMeta{Kind:"", APIVersion:""}, ObjectMeta:v1.ObjectMeta{Name:"ci-3510.3.2-a-e401d5bc82.17b36381e0c5064f", GenerateName:"", Namespace:"default", SelfLink:"", UID:"", ResourceVersion:"", Generation:0, CreationTimestamp:time.Date(1, time.January, 1, 0, 0, 0, 0, time.UTC), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string(nil), Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, InvolvedObject:v1.ObjectReference{Kind:"Node", Namespace:"", Name:"ci-3510.3.2-a-e401d5bc82", UID:"ci-3510.3.2-a-e401d5bc82", APIVersion:"", ResourceVersion:"", FieldPath:""}, Reason:"NodeHasSufficientMemory", Message:"Node ci-3510.3.2-a-e401d5bc82 status is now: NodeHasSufficientMemory", Source:v1.EventSource{Component:"kubelet", Host:"ci-3510.3.2-a-e401d5bc82"}, FirstTimestamp:time.Date(2024, time.February, 13, 9, 54, 53, 176489551, time.Local), LastTimestamp:time.Date(2024, time.February, 13, 9, 54, 53, 176489551, time.Local), Count:1, Type:"Normal", EventTime:time.Date(1, time.January, 1, 0, 0, 0, 0, time.UTC), Series:(*v1.EventSeries)(nil), Action:"", Related:(*v1.ObjectReference)(nil), ReportingController:"", ReportingInstance:""}': 'namespaces "default" not found' (will not retry!) Feb 13 09:54:55.666997 kubelet[2161]: E0213 09:54:55.666791 2161 event.go:267] Server rejected event '&v1.Event{TypeMeta:v1.TypeMeta{Kind:"", APIVersion:""}, ObjectMeta:v1.ObjectMeta{Name:"ci-3510.3.2-a-e401d5bc82.17b36381e0c5180b", GenerateName:"", Namespace:"default", SelfLink:"", UID:"", ResourceVersion:"", Generation:0, CreationTimestamp:time.Date(1, time.January, 1, 0, 0, 0, 0, time.UTC), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string(nil), Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, InvolvedObject:v1.ObjectReference{Kind:"Node", Namespace:"", Name:"ci-3510.3.2-a-e401d5bc82", UID:"ci-3510.3.2-a-e401d5bc82", APIVersion:"", ResourceVersion:"", FieldPath:""}, Reason:"NodeHasNoDiskPressure", Message:"Node ci-3510.3.2-a-e401d5bc82 status is now: NodeHasNoDiskPressure", Source:v1.EventSource{Component:"kubelet", Host:"ci-3510.3.2-a-e401d5bc82"}, FirstTimestamp:time.Date(2024, time.February, 13, 9, 54, 53, 176494091, time.Local), LastTimestamp:time.Date(2024, time.February, 13, 9, 54, 53, 176494091, time.Local), Count:1, Type:"Normal", EventTime:time.Date(1, time.January, 1, 0, 0, 0, 0, time.UTC), Series:(*v1.EventSeries)(nil), Action:"", Related:(*v1.ObjectReference)(nil), ReportingController:"", ReportingInstance:""}': 'namespaces "default" not found' (will not retry!)
Feb 13 09:54:55.729708 kubelet[2161]: E0213 09:54:55.729487 2161 event.go:267] Server rejected event '&v1.Event{TypeMeta:v1.TypeMeta{Kind:"", APIVersion:""}, ObjectMeta:v1.ObjectMeta{Name:"ci-3510.3.2-a-e401d5bc82.17b36381e0c524b3", GenerateName:"", Namespace:"default", SelfLink:"", UID:"", ResourceVersion:"", Generation:0, CreationTimestamp:time.Date(1, time.January, 1, 0, 0, 0, 0, time.UTC), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string(nil), Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, InvolvedObject:v1.ObjectReference{Kind:"Node", Namespace:"", Name:"ci-3510.3.2-a-e401d5bc82", UID:"ci-3510.3.2-a-e401d5bc82", APIVersion:"", ResourceVersion:"", FieldPath:""}, Reason:"NodeHasSufficientPID", Message:"Node ci-3510.3.2-a-e401d5bc82 status is now: NodeHasSufficientPID", Source:v1.EventSource{Component:"kubelet", Host:"ci-3510.3.2-a-e401d5bc82"}, FirstTimestamp:time.Date(2024, time.February, 13, 9, 54, 53, 176497331, time.Local), LastTimestamp:time.Date(2024, time.February, 13, 9, 54, 53, 176497331, time.Local), Count:1, Type:"Normal", EventTime:time.Date(1, time.January, 1, 0, 0, 0, 0, time.UTC), Series:(*v1.EventSeries)(nil), Action:"", Related:(*v1.ObjectReference)(nil), ReportingController:"", ReportingInstance:""}': 'namespaces "default" not found' (will not retry!) Feb 13 09:54:55.787797 kubelet[2161]: E0213 09:54:55.787463 2161 event.go:267] Server rejected event '&v1.Event{TypeMeta:v1.TypeMeta{Kind:"", APIVersion:""}, ObjectMeta:v1.ObjectMeta{Name:"ci-3510.3.2-a-e401d5bc82.17b36381e2bb8198", GenerateName:"", Namespace:"default", SelfLink:"", UID:"", ResourceVersion:"", Generation:0, CreationTimestamp:time.Date(1, time.January, 1, 0, 0, 0, 0, time.UTC), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string(nil), Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, InvolvedObject:v1.ObjectReference{Kind:"Node", Namespace:"", Name:"ci-3510.3.2-a-e401d5bc82", UID:"ci-3510.3.2-a-e401d5bc82", APIVersion:"", ResourceVersion:"", FieldPath:""}, Reason:"NodeAllocatableEnforced", Message:"Updated Node Allocatable limit across pods", Source:v1.EventSource{Component:"kubelet", Host:"ci-3510.3.2-a-e401d5bc82"}, FirstTimestamp:time.Date(2024, time.February, 13, 9, 54, 53, 209420184, time.Local), LastTimestamp:time.Date(2024, time.February, 13, 9, 54, 53, 209420184, time.Local), Count:1, Type:"Normal", EventTime:time.Date(1, time.January, 1, 0, 0, 0, 0, time.UTC), Series:(*v1.EventSeries)(nil), Action:"", Related:(*v1.ObjectReference)(nil), ReportingController:"", ReportingInstance:""}': 'namespaces "default" not found' (will not retry!)
Feb 13 09:54:55.849714 kubelet[2161]: E0213 09:54:55.849501 2161 event.go:267] Server rejected event '&v1.Event{TypeMeta:v1.TypeMeta{Kind:"", APIVersion:""}, ObjectMeta:v1.ObjectMeta{Name:"ci-3510.3.2-a-e401d5bc82.17b36381e0c5064f", GenerateName:"", Namespace:"default", SelfLink:"", UID:"", ResourceVersion:"", Generation:0, CreationTimestamp:time.Date(1, time.January, 1, 0, 0, 0, 0, time.UTC), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string(nil), Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, InvolvedObject:v1.ObjectReference{Kind:"Node", Namespace:"", Name:"ci-3510.3.2-a-e401d5bc82", UID:"ci-3510.3.2-a-e401d5bc82", APIVersion:"", ResourceVersion:"", FieldPath:""}, Reason:"NodeHasSufficientMemory", Message:"Node ci-3510.3.2-a-e401d5bc82 status is now: NodeHasSufficientMemory", Source:v1.EventSource{Component:"kubelet", Host:"ci-3510.3.2-a-e401d5bc82"}, FirstTimestamp:time.Date(2024, time.February, 13, 9, 54, 53, 176489551, time.Local), LastTimestamp:time.Date(2024, time.February, 13, 9, 54, 53, 224683920, time.Local), Count:2, Type:"Normal", EventTime:time.Date(1, time.January, 1, 0, 0, 0, 0, time.UTC), Series:(*v1.EventSeries)(nil), Action:"", Related:(*v1.ObjectReference)(nil), ReportingController:"", ReportingInstance:""}': 'namespaces "default" not found' (will not retry!) Feb 13 09:54:55.911220 kubelet[2161]: E0213 09:54:55.911010 2161 event.go:267] Server rejected event '&v1.Event{TypeMeta:v1.TypeMeta{Kind:"", APIVersion:""}, ObjectMeta:v1.ObjectMeta{Name:"ci-3510.3.2-a-e401d5bc82.17b36381e0c5180b", GenerateName:"", Namespace:"default", SelfLink:"", UID:"", ResourceVersion:"", Generation:0, CreationTimestamp:time.Date(1, time.January, 1, 0, 0, 0, 0, time.UTC), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string(nil), Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, InvolvedObject:v1.ObjectReference{Kind:"Node", Namespace:"", Name:"ci-3510.3.2-a-e401d5bc82", UID:"ci-3510.3.2-a-e401d5bc82", APIVersion:"", ResourceVersion:"", FieldPath:""}, Reason:"NodeHasNoDiskPressure", Message:"Node ci-3510.3.2-a-e401d5bc82 status is now: NodeHasNoDiskPressure", Source:v1.EventSource{Component:"kubelet", Host:"ci-3510.3.2-a-e401d5bc82"}, FirstTimestamp:time.Date(2024, time.February, 13, 9, 54, 53, 176494091, time.Local), LastTimestamp:time.Date(2024, time.February, 13, 9, 54, 53, 224703241, time.Local), Count:2, Type:"Normal", EventTime:time.Date(1, time.January, 1, 0, 0, 0, 0, time.UTC), Series:(*v1.EventSeries)(nil), Action:"", Related:(*v1.ObjectReference)(nil), ReportingController:"", ReportingInstance:""}': 'namespaces "default" not found' (will not retry!)
Feb 13 09:54:55.973639 kubelet[2161]: E0213 09:54:55.973436 2161 event.go:267] Server rejected event '&v1.Event{TypeMeta:v1.TypeMeta{Kind:"", APIVersion:""}, ObjectMeta:v1.ObjectMeta{Name:"ci-3510.3.2-a-e401d5bc82.17b36381e0c524b3", GenerateName:"", Namespace:"default", SelfLink:"", UID:"", ResourceVersion:"", Generation:0, CreationTimestamp:time.Date(1, time.January, 1, 0, 0, 0, 0, time.UTC), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string(nil), Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, InvolvedObject:v1.ObjectReference{Kind:"Node", Namespace:"", Name:"ci-3510.3.2-a-e401d5bc82", UID:"ci-3510.3.2-a-e401d5bc82", APIVersion:"", ResourceVersion:"", FieldPath:""}, Reason:"NodeHasSufficientPID", Message:"Node ci-3510.3.2-a-e401d5bc82 status is now: NodeHasSufficientPID", Source:v1.EventSource{Component:"kubelet", Host:"ci-3510.3.2-a-e401d5bc82"}, FirstTimestamp:time.Date(2024, time.February, 13, 9, 54, 53, 176497331, time.Local), LastTimestamp:time.Date(2024, time.February, 13, 9, 54, 53, 224712035, time.Local), Count:2, Type:"Normal", EventTime:time.Date(1, time.January, 1, 0, 0, 0, 0, time.UTC), Series:(*v1.EventSeries)(nil), Action:"", Related:(*v1.ObjectReference)(nil), ReportingController:"", ReportingInstance:""}': 'namespaces "default" not found' (will not retry!) Feb 13 09:54:56.133490 kubelet[2161]: E0213 09:54:56.133409 2161 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-3510.3.2-a-e401d5bc82\" not found" node="ci-3510.3.2-a-e401d5bc82" Feb 13 09:54:56.157508 kubelet[2161]: E0213 09:54:56.157275 2161 event.go:267] Server rejected event '&v1.Event{TypeMeta:v1.TypeMeta{Kind:"", APIVersion:""}, ObjectMeta:v1.ObjectMeta{Name:"ci-3510.3.2-a-e401d5bc82.17b36381e0c5064f", GenerateName:"", Namespace:"default", SelfLink:"", UID:"", ResourceVersion:"", Generation:0, CreationTimestamp:time.Date(1, time.January, 1, 0, 0, 0, 0, time.UTC), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string(nil), Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, InvolvedObject:v1.ObjectReference{Kind:"Node", Namespace:"", Name:"ci-3510.3.2-a-e401d5bc82", UID:"ci-3510.3.2-a-e401d5bc82", APIVersion:"", ResourceVersion:"", FieldPath:""}, Reason:"NodeHasSufficientMemory", Message:"Node ci-3510.3.2-a-e401d5bc82 status is now: NodeHasSufficientMemory", Source:v1.EventSource{Component:"kubelet", Host:"ci-3510.3.2-a-e401d5bc82"}, FirstTimestamp:time.Date(2024, time.February, 13, 9, 54, 53, 176489551, time.Local), LastTimestamp:time.Date(2024, time.February, 13, 9, 54, 53, 260986730, time.Local), Count:3, Type:"Normal", EventTime:time.Date(1, time.January, 1, 0, 0, 0, 0, time.UTC), Series:(*v1.EventSeries)(nil), Action:"", Related:(*v1.ObjectReference)(nil), ReportingController:"", ReportingInstance:""}': 'namespaces "default" not found' (will not retry!)
Feb 13 09:54:56.240909 kubelet[2161]: I0213 09:54:56.240861 2161 kubelet_node_status.go:70] "Attempting to register node" node="ci-3510.3.2-a-e401d5bc82" Feb 13 09:54:56.529123 kubelet[2161]: I0213 09:54:56.528919 2161 kubelet_node_status.go:73] "Successfully registered node" node="ci-3510.3.2-a-e401d5bc82" Feb 13 09:54:56.548929 kubelet[2161]: E0213 09:54:56.548838 2161 kubelet_node_status.go:458] "Error getting the current node from lister" err="node \"ci-3510.3.2-a-e401d5bc82\" not found" Feb 13 09:54:56.555939 kubelet[2161]: E0213 09:54:56.555750 2161 event.go:267] Server rejected event '&v1.Event{TypeMeta:v1.TypeMeta{Kind:"", APIVersion:""}, ObjectMeta:v1.ObjectMeta{Name:"ci-3510.3.2-a-e401d5bc82.17b36381e0c5180b", GenerateName:"", Namespace:"default", SelfLink:"", UID:"", ResourceVersion:"", Generation:0, CreationTimestamp:time.Date(1, time.January, 1, 0, 0, 0, 0, time.UTC), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string(nil), Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, InvolvedObject:v1.ObjectReference{Kind:"Node", Namespace:"", Name:"ci-3510.3.2-a-e401d5bc82", UID:"ci-3510.3.2-a-e401d5bc82", APIVersion:"", ResourceVersion:"", FieldPath:""}, Reason:"NodeHasNoDiskPressure", Message:"Node ci-3510.3.2-a-e401d5bc82 status is now: NodeHasNoDiskPressure", Source:v1.EventSource{Component:"kubelet", Host:"ci-3510.3.2-a-e401d5bc82"}, FirstTimestamp:time.Date(2024, time.February, 13, 9, 54, 53, 176494091, time.Local), LastTimestamp:time.Date(2024, time.February, 13, 9, 54, 53, 261005656, time.Local), Count:3, Type:"Normal", EventTime:time.Date(1, time.January, 1, 0, 0, 0, 0, time.UTC), Series:(*v1.EventSeries)(nil), Action:"", Related:(*v1.ObjectReference)(nil), ReportingController:"", ReportingInstance:""}': 'namespaces "default" not found' (will not retry!)
Feb 13 09:54:56.649000 kubelet[2161]: E0213 09:54:56.648950 2161 kubelet_node_status.go:458] "Error getting the current node from lister" err="node \"ci-3510.3.2-a-e401d5bc82\" not found" Feb 13 09:54:56.749473 kubelet[2161]: E0213 09:54:56.749407 2161 kubelet_node_status.go:458] "Error getting the current node from lister" err="node \"ci-3510.3.2-a-e401d5bc82\" not found" Feb 13 09:54:56.850234 kubelet[2161]: E0213 09:54:56.850084 2161 kubelet_node_status.go:458] "Error getting the current node from lister" err="node \"ci-3510.3.2-a-e401d5bc82\" not found" Feb 13 09:54:56.951242 kubelet[2161]: E0213 09:54:56.951193 2161 kubelet_node_status.go:458] "Error getting the current node from lister" err="node \"ci-3510.3.2-a-e401d5bc82\" not found" Feb 13 09:54:56.956370 kubelet[2161]: E0213 09:54:56.956155 2161 event.go:267] Server rejected event '&v1.Event{TypeMeta:v1.TypeMeta{Kind:"", APIVersion:""}, ObjectMeta:v1.ObjectMeta{Name:"ci-3510.3.2-a-e401d5bc82.17b36381e0c524b3", GenerateName:"", Namespace:"default", SelfLink:"", UID:"", ResourceVersion:"", Generation:0, CreationTimestamp:time.Date(1, time.January, 1, 0, 0, 0, 0, time.UTC), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string(nil), Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, InvolvedObject:v1.ObjectReference{Kind:"Node", Namespace:"", Name:"ci-3510.3.2-a-e401d5bc82", UID:"ci-3510.3.2-a-e401d5bc82", APIVersion:"", ResourceVersion:"", FieldPath:""}, Reason:"NodeHasSufficientPID", Message:"Node ci-3510.3.2-a-e401d5bc82 status is now: NodeHasSufficientPID", Source:v1.EventSource{Component:"kubelet", Host:"ci-3510.3.2-a-e401d5bc82"}, FirstTimestamp:time.Date(2024, time.February, 13, 9, 54, 53, 176497331, time.Local), LastTimestamp:time.Date(2024, time.February, 13, 9, 54, 53, 261013737, time.Local), Count:3, Type:"Normal", EventTime:time.Date(1, time.January, 1, 0, 0, 0, 0, time.UTC), Series:(*v1.EventSeries)(nil), Action:"", Related:(*v1.ObjectReference)(nil), ReportingController:"", ReportingInstance:""}': 'namespaces "default" not found' (will not retry!)
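[Editor's note] The audit PROCTITLE fields scattered through the records above carry the process command line as hex-encoded bytes, with NUL separators between arguments; auditd also truncates long command lines, which is why several values end mid-flag. A minimal decoding sketch, assuming Python 3; the hex value is copied verbatim from the kube-controller-manager PROCTITLE record earlier in this section:

```python
# Decode an auditd PROCTITLE value: hex-encoded argv, NUL-separated.
# The hex is copied from the kube-controller-manager record above;
# auditd truncated it, so the final argument is cut off ("--authori...").
hex_value = (
    "6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565"
    "002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E7472"
    "6F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269"
)
argv = bytes.fromhex(hex_value).split(b"\x00")  # NUL bytes delimit argv entries
print([arg.decode() for arg in argv])
# ['kube-controller-manager', '--allocate-node-cidrs=true',
#  '--authentication-kubeconfig=/etc/kubernetes/controller-manager.conf', '--authori']
```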
Feb 13 09:54:57.051976 kubelet[2161]: E0213 09:54:57.051887 2161 kubelet_node_status.go:458] "Error getting the current node from lister" err="node \"ci-3510.3.2-a-e401d5bc82\" not found" Feb 13 09:54:57.152669 kubelet[2161]: E0213 09:54:57.152620 2161 kubelet_node_status.go:458] "Error getting the current node from lister" err="node \"ci-3510.3.2-a-e401d5bc82\" not found" Feb 13 09:54:57.253474 kubelet[2161]: E0213 09:54:57.253411 2161 kubelet_node_status.go:458] "Error getting the current node from lister" err="node \"ci-3510.3.2-a-e401d5bc82\" not found" Feb 13 09:54:57.353794 kubelet[2161]: E0213 09:54:57.353733 2161 kubelet_node_status.go:458] "Error getting the current node from lister" err="node \"ci-3510.3.2-a-e401d5bc82\" not found" Feb 13 09:54:57.454384 kubelet[2161]: E0213 09:54:57.454153 2161 kubelet_node_status.go:458] "Error getting the current node from lister" err="node \"ci-3510.3.2-a-e401d5bc82\" not found" Feb 13 09:54:57.555175 kubelet[2161]: E0213 09:54:57.555113 2161 kubelet_node_status.go:458] "Error getting the current node from lister" err="node \"ci-3510.3.2-a-e401d5bc82\" not found" Feb 13 09:54:57.655571 kubelet[2161]: E0213 09:54:57.655507 2161 kubelet_node_status.go:458] "Error getting the current node from lister" err="node \"ci-3510.3.2-a-e401d5bc82\" not found" Feb 13 09:54:58.123242 kubelet[2161]: I0213 09:54:58.123179 2161 apiserver.go:52] "Watching apiserver" Feb 13 09:54:58.222064 kubelet[2161]: I0213 09:54:58.221966 2161 desired_state_of_world_populator.go:159] "Finished populating initial desired state of world" Feb 13 09:54:58.254210 kubelet[2161]: I0213 09:54:58.254153 2161 reconciler.go:41] "Reconciler: start to sync state" Feb 13 09:54:58.580076 systemd[1]: Reloading. Feb 13 09:54:58.619557 /usr/lib/systemd/system-generators/torcx-generator[2533]: time="2024-02-13T09:54:58Z" level=debug msg="common configuration parsed" base_dir=/var/lib/torcx/ conf_dir=/etc/torcx/ run_dir=/run/torcx/ store_paths="[/usr/share/torcx/store /usr/share/oem/torcx/store/3510.3.2 /usr/share/oem/torcx/store /var/lib/torcx/store/3510.3.2 /var/lib/torcx/store]" Feb 13 09:54:58.619574 /usr/lib/systemd/system-generators/torcx-generator[2533]: time="2024-02-13T09:54:58Z" level=info msg="torcx already run" Feb 13 09:54:58.681134 systemd[1]: /usr/lib/systemd/system/locksmithd.service:8: Unit uses CPUShares=; please use CPUWeight= instead. Support for CPUShares= will be removed soon. Feb 13 09:54:58.681146 systemd[1]: /usr/lib/systemd/system/locksmithd.service:9: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 13 09:54:58.695502 systemd[1]: /run/systemd/system/docker.socket:8: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. 
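[Editor's note] For reading the denial records that resume below: on x86_64, syscall=321 is bpf(2) and syscall=254 is inotify_add_watch(2), exit=-13 is -EACCES, and the numeric capability matches the symbolic permission in the same record, since capability 38 is CAP_PERFMON and 39 is CAP_BPF (both split out of CAP_SYS_ADMIN in Linux 5.8). The paired BPF prog-id LOAD/UNLOAD operations around each container start are consistent with runc installing per-container device-cgroup filters, though the log itself does not name the program type. A small lookup sketch, assuming Python 3:

```python
# Name the numeric fields that recur in the AVC/SYSCALL records.
# Syscall numbers are from the x86_64 syscall table; capability numbers
# follow the kernel's capability list (CAP_PERFMON/CAP_BPF: Linux 5.8+).
SYSCALLS = {254: "inotify_add_watch", 321: "bpf"}
CAPABILITIES = {38: "CAP_PERFMON", 39: "CAP_BPF"}
ERRNO = {-13: "EACCES"}

# The kube-apiserver/kube-controller watch denials (syscall=254, exit=-13):
print(SYSCALLS[254], ERRNO[-13])        # inotify_add_watch EACCES
# The runc/systemd BPF program loads (syscall=321, capability=39):
print(SYSCALLS[321], CAPABILITIES[39])  # bpf CAP_BPF
```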
Feb 13 09:54:58.748000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:58.748000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:58.748000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:58.748000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:58.748000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:58.748000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:58.748000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:58.748000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:58.748000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:58.748000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:58.748000 audit: BPF prog-id=91 op=LOAD Feb 13 09:54:58.748000 audit: BPF prog-id=52 op=UNLOAD Feb 13 09:54:58.749000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:58.749000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:58.749000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:58.749000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:58.749000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:58.749000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:58.749000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 
scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:58.749000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:58.749000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:58.749000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:58.749000 audit: BPF prog-id=92 op=LOAD Feb 13 09:54:58.749000 audit: BPF prog-id=53 op=UNLOAD Feb 13 09:54:58.749000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:58.749000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:58.749000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:58.749000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:58.749000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:58.749000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:58.749000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:58.749000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:58.749000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:58.750000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:58.750000 audit: BPF prog-id=93 op=LOAD Feb 13 09:54:58.750000 audit: BPF prog-id=54 op=UNLOAD Feb 13 09:54:58.750000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:58.750000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:58.750000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 
scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:58.750000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:58.750000 audit: BPF prog-id=94 op=LOAD Feb 13 09:54:58.750000 audit: BPF prog-id=55 op=UNLOAD Feb 13 09:54:58.750000 audit: BPF prog-id=95 op=LOAD Feb 13 09:54:58.750000 audit: BPF prog-id=96 op=LOAD Feb 13 09:54:58.750000 audit: BPF prog-id=56 op=UNLOAD Feb 13 09:54:58.750000 audit: BPF prog-id=57 op=UNLOAD Feb 13 09:54:58.751000 audit: BPF prog-id=97 op=LOAD Feb 13 09:54:58.751000 audit: BPF prog-id=58 op=UNLOAD Feb 13 09:54:58.751000 audit: BPF prog-id=98 op=LOAD Feb 13 09:54:58.751000 audit: BPF prog-id=99 op=LOAD Feb 13 09:54:58.751000 audit: BPF prog-id=59 op=UNLOAD Feb 13 09:54:58.751000 audit: BPF prog-id=60 op=UNLOAD Feb 13 09:54:58.752000 audit: BPF prog-id=100 op=LOAD Feb 13 09:54:58.752000 audit: BPF prog-id=67 op=UNLOAD Feb 13 09:54:58.752000 audit: BPF prog-id=101 op=LOAD Feb 13 09:54:58.752000 audit: BPF prog-id=61 op=UNLOAD Feb 13 09:54:58.752000 audit: BPF prog-id=102 op=LOAD Feb 13 09:54:58.752000 audit: BPF prog-id=103 op=LOAD Feb 13 09:54:58.752000 audit: BPF prog-id=62 op=UNLOAD Feb 13 09:54:58.752000 audit: BPF prog-id=63 op=UNLOAD Feb 13 09:54:58.753000 audit: BPF prog-id=104 op=LOAD Feb 13 09:54:58.753000 audit: BPF prog-id=72 op=UNLOAD Feb 13 09:54:58.753000 audit: BPF prog-id=105 op=LOAD Feb 13 09:54:58.753000 audit: BPF prog-id=87 op=UNLOAD Feb 13 09:54:58.753000 audit: BPF prog-id=106 op=LOAD Feb 13 09:54:58.753000 audit: BPF prog-id=69 op=UNLOAD Feb 13 09:54:58.754000 audit: BPF prog-id=107 op=LOAD Feb 13 09:54:58.754000 audit: BPF prog-id=79 op=UNLOAD Feb 13 09:54:58.754000 audit: BPF prog-id=108 op=LOAD Feb 13 09:54:58.754000 audit: BPF prog-id=64 op=UNLOAD Feb 13 09:54:58.755000 audit: BPF prog-id=109 op=LOAD Feb 13 09:54:58.755000 audit: BPF prog-id=83 op=UNLOAD Feb 13 09:54:58.756000 audit: BPF prog-id=110 op=LOAD Feb 13 09:54:58.756000 audit: BPF prog-id=111 op=LOAD Feb 13 09:54:58.756000 audit: BPF prog-id=65 op=UNLOAD Feb 13 09:54:58.756000 audit: BPF prog-id=66 op=UNLOAD
Feb 13 09:54:58.762683 kubelet[2161]: I0213 09:54:58.762637 2161 dynamic_cafile_content.go:171] "Shutting down controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Feb 13 09:54:58.762652 systemd[1]: Stopping kubelet.service... Feb 13 09:54:58.783679 systemd[1]: kubelet.service: Deactivated successfully. Feb 13 09:54:58.783785 systemd[1]: Stopped kubelet.service. Feb 13 09:54:58.783000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:54:58.784720 systemd[1]: Started kubelet.service. Feb 13 09:54:58.784000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 09:54:58.809192 kubelet[2593]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.27. Image garbage collector will get sandbox image information from CRI. Feb 13 09:54:58.809192 kubelet[2593]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 13 09:54:58.809417 kubelet[2593]: I0213 09:54:58.809212 2593 server.go:198] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Feb 13 09:54:58.809969 kubelet[2593]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.27. Image garbage collector will get sandbox image information from CRI.
Feb 13 09:54:58.809969 kubelet[2593]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 13 09:54:58.812193 kubelet[2593]: I0213 09:54:58.812155 2593 server.go:412] "Kubelet version" kubeletVersion="v1.26.5" Feb 13 09:54:58.812193 kubelet[2593]: I0213 09:54:58.812165 2593 server.go:414] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Feb 13 09:54:58.812287 kubelet[2593]: I0213 09:54:58.812281 2593 server.go:836] "Client rotation is on, will bootstrap in background" Feb 13 09:54:58.813008 kubelet[2593]: I0213 09:54:58.812991 2593 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Feb 13 09:54:58.813351 kubelet[2593]: I0213 09:54:58.813342 2593 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Feb 13 09:54:58.830788 kubelet[2593]: I0213 09:54:58.830709 2593 server.go:659] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Feb 13 09:54:58.830835 kubelet[2593]: I0213 09:54:58.830807 2593 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Feb 13 09:54:58.830874 kubelet[2593]: I0213 09:54:58.830867 2593 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={RuntimeCgroupsName: SystemCgroupsName: KubeletCgroupsName: KubeletOOMScoreAdj:-999 ContainerRuntime: CgroupsPerQOS:true CgroupRoot:/ CgroupDriver:systemd KubeletRootDir:/var/lib/kubelet ProtectKernelDefaults:false NodeAllocatableConfig:{KubeReservedCgroupName: SystemReservedCgroupName: ReservedSystemCPUs: EnforceNodeAllocatable:map[pods:{}] KubeReserved:map[] SystemReserved:map[] HardEvictionThresholds:[{Signal:nodefs.inodesFree Operator:LessThan Value:{Quantity: Percentage:0.05} GracePeriod:0s MinReclaim:} {Signal:imagefs.available Operator:LessThan Value:{Quantity: Percentage:0.15} GracePeriod:0s MinReclaim:} {Signal:memory.available Operator:LessThan Value:{Quantity:100Mi Percentage:0} GracePeriod:0s MinReclaim:} {Signal:nodefs.available Operator:LessThan Value:{Quantity: Percentage:0.1} GracePeriod:0s MinReclaim:}]} QOSReserved:map[] CPUManagerPolicy:none CPUManagerPolicyOptions:map[] ExperimentalTopologyManagerScope:container CPUManagerReconcilePeriod:10s ExperimentalMemoryManagerPolicy:None ExperimentalMemoryManagerReservedMemory:[] ExperimentalPodPidsLimit:-1 EnforceCPULimits:true CPUCFSQuotaPeriod:100ms ExperimentalTopologyManagerPolicy:none ExperimentalTopologyManagerPolicyOptions:map[]} Feb 13 09:54:58.830932 kubelet[2593]: I0213 09:54:58.830885 2593 topology_manager.go:134] "Creating topology manager with policy per scope" topologyPolicyName="none" topologyScopeName="container" Feb 13 09:54:58.830932 kubelet[2593]: I0213 09:54:58.830896 2593 container_manager_linux.go:308] "Creating device plugin manager" Feb 13 09:54:58.830932 kubelet[2593]: I0213 09:54:58.830921 2593 state_mem.go:36] "Initialized new in-memory state store" Feb 13 09:54:58.833173 kubelet[2593]: I0213 09:54:58.833165 2593 kubelet.go:398] "Attempting to sync node with API server" Feb 13 09:54:58.833211 kubelet[2593]: I0213 09:54:58.833177 2593 kubelet.go:286] "Adding static pod path" path="/etc/kubernetes/manifests" Feb 13 09:54:58.833211 kubelet[2593]: I0213 09:54:58.833196 2593 kubelet.go:297] "Adding apiserver pod source" Feb 13 
09:54:58.833260 kubelet[2593]: I0213 09:54:58.833212 2593 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Feb 13 09:54:58.833547 kubelet[2593]: I0213 09:54:58.833537 2593 kuberuntime_manager.go:244] "Container runtime initialized" containerRuntime="containerd" version="1.6.16" apiVersion="v1" Feb 13 09:54:58.833778 kubelet[2593]: I0213 09:54:58.833770 2593 server.go:1186] "Started kubelet" Feb 13 09:54:58.833815 kubelet[2593]: I0213 09:54:58.833800 2593 server.go:161] "Starting to listen" address="0.0.0.0" port=10250 Feb 13 09:54:58.834329 kubelet[2593]: I0213 09:54:58.834321 2593 server.go:451] "Adding debug handlers to kubelet server" Feb 13 09:54:58.834000 audit[2593]: AVC avc: denied { mac_admin } for pid=2593 comm="kubelet" capability=33 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:58.834000 audit: SELINUX_ERR op=setxattr invalid_context="system_u:object_r:container_file_t:s0" Feb 13 09:54:58.834000 audit[2593]: SYSCALL arch=c000003e syscall=188 success=no exit=-22 a0=c000d06660 a1=c000d08498 a2=c000d06630 a3=25 items=0 ppid=1 pid=2593 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kubelet" exe="/opt/bin/kubelet" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:58.834000 audit: PROCTITLE proctitle=2F6F70742F62696E2F6B7562656C6574002D2D626F6F7473747261702D6B756265636F6E6669673D2F6574632F6B756265726E657465732F626F6F7473747261702D6B7562656C65742E636F6E66002D2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F6B7562656C65742E636F6E66002D2D636F6E6669 Feb 13 09:54:58.834000 audit[2593]: AVC avc: denied { mac_admin } for pid=2593 comm="kubelet" capability=33 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:58.834000 audit: SELINUX_ERR op=setxattr invalid_context="system_u:object_r:container_file_t:s0" Feb 13 09:54:58.834000 audit[2593]: SYSCALL arch=c000003e syscall=188 success=no exit=-22 a0=c000d1a1a0 a1=c000d084b0 a2=c000d066f0 a3=25 items=0 ppid=1 pid=2593 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kubelet" exe="/opt/bin/kubelet" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:58.834000 audit: PROCTITLE proctitle=2F6F70742F62696E2F6B7562656C6574002D2D626F6F7473747261702D6B756265636F6E6669673D2F6574632F6B756265726E657465732F626F6F7473747261702D6B7562656C65742E636F6E66002D2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F6B7562656C65742E636F6E66002D2D636F6E6669 Feb 13 09:54:58.834726 kubelet[2593]: I0213 09:54:58.834428 2593 kubelet.go:1341] "Unprivileged containerized plugins might not work, could not set selinux context on plugin registration dir" path="/var/lib/kubelet/plugins_registry" err="setxattr /var/lib/kubelet/plugins_registry: invalid argument" Feb 13 09:54:58.834726 kubelet[2593]: I0213 09:54:58.834445 2593 kubelet.go:1345] "Unprivileged containerized plugins might not work, could not set selinux context on plugins dir" path="/var/lib/kubelet/plugins" err="setxattr /var/lib/kubelet/plugins: invalid argument" Feb 13 09:54:58.834726 kubelet[2593]: I0213 09:54:58.834458 2593 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Feb 13 09:54:58.834726 kubelet[2593]: I0213 09:54:58.834493 2593 volume_manager.go:293] "Starting Kubelet Volume Manager" Feb 13 09:54:58.834726 kubelet[2593]: I0213 09:54:58.834529 2593 desired_state_of_world_populator.go:151] 
"Desired state populator starts to run" Feb 13 09:54:58.835570 kubelet[2593]: E0213 09:54:58.835555 2593 cri_stats_provider.go:455] "Failed to get the info of the filesystem with mountpoint" err="unable to find data in memory cache" mountpoint="/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs" Feb 13 09:54:58.835622 kubelet[2593]: E0213 09:54:58.835574 2593 kubelet.go:1386] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Feb 13 09:54:58.846712 kubelet[2593]: I0213 09:54:58.846698 2593 kubelet_network_linux.go:63] "Initialized iptables rules." protocol=IPv4 Feb 13 09:54:58.852684 kubelet[2593]: I0213 09:54:58.852637 2593 kubelet_network_linux.go:63] "Initialized iptables rules." protocol=IPv6 Feb 13 09:54:58.852684 kubelet[2593]: I0213 09:54:58.852649 2593 status_manager.go:176] "Starting to sync pod status with apiserver" Feb 13 09:54:58.852684 kubelet[2593]: I0213 09:54:58.852662 2593 kubelet.go:2113] "Starting kubelet main sync loop" Feb 13 09:54:58.852684 kubelet[2593]: E0213 09:54:58.852688 2593 kubelet.go:2137] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Feb 13 09:54:58.855176 kubelet[2593]: I0213 09:54:58.855167 2593 cpu_manager.go:214] "Starting CPU manager" policy="none" Feb 13 09:54:58.855176 kubelet[2593]: I0213 09:54:58.855174 2593 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Feb 13 09:54:58.855245 kubelet[2593]: I0213 09:54:58.855182 2593 state_mem.go:36] "Initialized new in-memory state store" Feb 13 09:54:58.855268 kubelet[2593]: I0213 09:54:58.855262 2593 state_mem.go:88] "Updated default CPUSet" cpuSet="" Feb 13 09:54:58.855287 kubelet[2593]: I0213 09:54:58.855271 2593 state_mem.go:96] "Updated CPUSet assignments" assignments=map[] Feb 13 09:54:58.855287 kubelet[2593]: I0213 09:54:58.855275 2593 policy_none.go:49] "None policy: Start" Feb 13 09:54:58.855539 kubelet[2593]: I0213 09:54:58.855500 2593 memory_manager.go:169] "Starting memorymanager" policy="None" Feb 13 09:54:58.855539 kubelet[2593]: I0213 09:54:58.855511 2593 state_mem.go:35] "Initializing new in-memory state store" Feb 13 09:54:58.855596 kubelet[2593]: I0213 09:54:58.855589 2593 state_mem.go:75] "Updated machine memory state" Feb 13 09:54:58.857285 kubelet[2593]: I0213 09:54:58.857242 2593 manager.go:455] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Feb 13 09:54:58.857285 kubelet[2593]: I0213 09:54:58.857268 2593 server.go:88] "Unprivileged containerized plugins might not work. 
Could not set selinux context on socket dir" path="/var/lib/kubelet/device-plugins/" err="setxattr /var/lib/kubelet/device-plugins/: invalid argument" Feb 13 09:54:58.856000 audit[2593]: AVC avc: denied { mac_admin } for pid=2593 comm="kubelet" capability=33 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:54:58.856000 audit: SELINUX_ERR op=setxattr invalid_context="system_u:object_r:container_file_t:s0" Feb 13 09:54:58.856000 audit[2593]: SYSCALL arch=c000003e syscall=188 success=no exit=-22 a0=c001077cb0 a1=c000519e60 a2=c001077c80 a3=25 items=0 ppid=1 pid=2593 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kubelet" exe="/opt/bin/kubelet" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:54:58.856000 audit: PROCTITLE proctitle=2F6F70742F62696E2F6B7562656C6574002D2D626F6F7473747261702D6B756265636F6E6669673D2F6574632F6B756265726E657465732F626F6F7473747261702D6B7562656C65742E636F6E66002D2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F6B7562656C65742E636F6E66002D2D636F6E6669 Feb 13 09:54:58.857460 kubelet[2593]: I0213 09:54:58.857394 2593 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Feb 13 09:54:58.943558 kubelet[2593]: I0213 09:54:58.943472 2593 kubelet_node_status.go:70] "Attempting to register node" node="ci-3510.3.2-a-e401d5bc82" Feb 13 09:54:58.953075 kubelet[2593]: I0213 09:54:58.953024 2593 topology_manager.go:210] "Topology Admit Handler" Feb 13 09:54:58.953318 kubelet[2593]: I0213 09:54:58.953171 2593 topology_manager.go:210] "Topology Admit Handler" Feb 13 09:54:58.953318 kubelet[2593]: I0213 09:54:58.953277 2593 topology_manager.go:210] "Topology Admit Handler" Feb 13 09:54:58.955999 kubelet[2593]: I0213 09:54:58.955915 2593 kubelet_node_status.go:108] "Node was previously registered" node="ci-3510.3.2-a-e401d5bc82" Feb 13 09:54:58.956172 kubelet[2593]: I0213 09:54:58.956090 2593 kubelet_node_status.go:73] "Successfully registered node" node="ci-3510.3.2-a-e401d5bc82" Feb 13 09:54:58.961867 kubelet[2593]: E0213 09:54:58.961809 2593 kubelet.go:1802] "Failed creating a mirror pod for" err="pods \"kube-scheduler-ci-3510.3.2-a-e401d5bc82\" already exists" pod="kube-system/kube-scheduler-ci-3510.3.2-a-e401d5bc82" Feb 13 09:54:59.035084 kubelet[2593]: I0213 09:54:59.035038 2593 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/fa6515c6a6ba63367f92fd7390859950-kubeconfig\") pod \"kube-scheduler-ci-3510.3.2-a-e401d5bc82\" (UID: \"fa6515c6a6ba63367f92fd7390859950\") " pod="kube-system/kube-scheduler-ci-3510.3.2-a-e401d5bc82" Feb 13 09:54:59.035084 kubelet[2593]: I0213 09:54:59.035061 2593 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/5f44336ac0c20008fecb50c2ba85b6c8-ca-certs\") pod \"kube-controller-manager-ci-3510.3.2-a-e401d5bc82\" (UID: \"5f44336ac0c20008fecb50c2ba85b6c8\") " pod="kube-system/kube-controller-manager-ci-3510.3.2-a-e401d5bc82" Feb 13 09:54:59.035084 kubelet[2593]: I0213 09:54:59.035073 2593 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/5f44336ac0c20008fecb50c2ba85b6c8-flexvolume-dir\") pod \"kube-controller-manager-ci-3510.3.2-a-e401d5bc82\" (UID: \"5f44336ac0c20008fecb50c2ba85b6c8\") " 
pod="kube-system/kube-controller-manager-ci-3510.3.2-a-e401d5bc82" Feb 13 09:54:59.035084 kubelet[2593]: I0213 09:54:59.035085 2593 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/5f44336ac0c20008fecb50c2ba85b6c8-k8s-certs\") pod \"kube-controller-manager-ci-3510.3.2-a-e401d5bc82\" (UID: \"5f44336ac0c20008fecb50c2ba85b6c8\") " pod="kube-system/kube-controller-manager-ci-3510.3.2-a-e401d5bc82" Feb 13 09:54:59.035325 kubelet[2593]: I0213 09:54:59.035104 2593 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/5f44336ac0c20008fecb50c2ba85b6c8-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-3510.3.2-a-e401d5bc82\" (UID: \"5f44336ac0c20008fecb50c2ba85b6c8\") " pod="kube-system/kube-controller-manager-ci-3510.3.2-a-e401d5bc82" Feb 13 09:54:59.035325 kubelet[2593]: I0213 09:54:59.035122 2593 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/c9958bab1d18b2706d971dd68861d0c2-ca-certs\") pod \"kube-apiserver-ci-3510.3.2-a-e401d5bc82\" (UID: \"c9958bab1d18b2706d971dd68861d0c2\") " pod="kube-system/kube-apiserver-ci-3510.3.2-a-e401d5bc82" Feb 13 09:54:59.035325 kubelet[2593]: I0213 09:54:59.035170 2593 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/c9958bab1d18b2706d971dd68861d0c2-k8s-certs\") pod \"kube-apiserver-ci-3510.3.2-a-e401d5bc82\" (UID: \"c9958bab1d18b2706d971dd68861d0c2\") " pod="kube-system/kube-apiserver-ci-3510.3.2-a-e401d5bc82" Feb 13 09:54:59.035325 kubelet[2593]: I0213 09:54:59.035206 2593 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/c9958bab1d18b2706d971dd68861d0c2-usr-share-ca-certificates\") pod \"kube-apiserver-ci-3510.3.2-a-e401d5bc82\" (UID: \"c9958bab1d18b2706d971dd68861d0c2\") " pod="kube-system/kube-apiserver-ci-3510.3.2-a-e401d5bc82" Feb 13 09:54:59.035325 kubelet[2593]: I0213 09:54:59.035233 2593 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/5f44336ac0c20008fecb50c2ba85b6c8-kubeconfig\") pod \"kube-controller-manager-ci-3510.3.2-a-e401d5bc82\" (UID: \"5f44336ac0c20008fecb50c2ba85b6c8\") " pod="kube-system/kube-controller-manager-ci-3510.3.2-a-e401d5bc82" Feb 13 09:54:59.038225 kubelet[2593]: E0213 09:54:59.038188 2593 kubelet.go:1802] "Failed creating a mirror pod for" err="pods \"kube-apiserver-ci-3510.3.2-a-e401d5bc82\" already exists" pod="kube-system/kube-apiserver-ci-3510.3.2-a-e401d5bc82" Feb 13 09:54:59.833927 kubelet[2593]: I0213 09:54:59.833806 2593 apiserver.go:52] "Watching apiserver" Feb 13 09:55:00.135029 kubelet[2593]: I0213 09:55:00.134936 2593 desired_state_of_world_populator.go:159] "Finished populating initial desired state of world" Feb 13 09:55:00.141932 kubelet[2593]: I0213 09:55:00.141850 2593 reconciler.go:41] "Reconciler: start to sync state" Feb 13 09:55:00.438434 kubelet[2593]: E0213 09:55:00.438323 2593 kubelet.go:1802] "Failed creating a mirror pod for" err="pods \"kube-apiserver-ci-3510.3.2-a-e401d5bc82\" already exists" pod="kube-system/kube-apiserver-ci-3510.3.2-a-e401d5bc82" Feb 13 09:55:00.641731 
kubelet[2593]: E0213 09:55:00.641632 2593 kubelet.go:1802] "Failed creating a mirror pod for" err="pods \"kube-scheduler-ci-3510.3.2-a-e401d5bc82\" already exists" pod="kube-system/kube-scheduler-ci-3510.3.2-a-e401d5bc82" Feb 13 09:55:00.843665 kubelet[2593]: I0213 09:55:00.843575 2593 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-3510.3.2-a-e401d5bc82" podStartSLOduration=2.84353838 pod.CreationTimestamp="2024-02-13 09:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-02-13 09:55:00.84326725 +0000 UTC m=+2.056786700" watchObservedRunningTime="2024-02-13 09:55:00.84353838 +0000 UTC m=+2.057057825" Feb 13 09:55:01.637317 kubelet[2593]: I0213 09:55:01.637294 2593 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-3510.3.2-a-e401d5bc82" podStartSLOduration=3.637270797 pod.CreationTimestamp="2024-02-13 09:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-02-13 09:55:01.637175982 +0000 UTC m=+2.850695429" watchObservedRunningTime="2024-02-13 09:55:01.637270797 +0000 UTC m=+2.850790243"
Feb 13 09:55:03.941987 sudo[1628]: pam_unix(sudo:session): session closed for user root Feb 13 09:55:03.941000 audit[1628]: USER_END pid=1628 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_unix,pam_permit,pam_systemd acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Feb 13 09:55:03.942821 sshd[1625]: pam_unix(sshd:session): session closed for user core Feb 13 09:55:03.944327 systemd[1]: sshd@6-139.178.70.43:22-139.178.68.195:33436.service: Deactivated successfully. Feb 13 09:55:03.944734 systemd[1]: session-9.scope: Deactivated successfully. Feb 13 09:55:03.944817 systemd[1]: session-9.scope: Consumed 3.501s CPU time. Feb 13 09:55:03.945131 systemd-logind[1461]: Session 9 logged out. Waiting for processes to exit. Feb 13 09:55:03.945631 systemd-logind[1461]: Removed session 9. Feb 13 09:55:03.968120 kernel: kauditd_printk_skb: 670 callbacks suppressed Feb 13 09:55:03.941000 audit[1628]: CRED_DISP pid=1628 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Feb 13 09:55:03.943000 audit[1625]: USER_END pid=1625 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 09:55:03.943000 audit[1625]: CRED_DISP pid=1625 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 09:55:03.944000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@6-139.178.70.43:22-139.178.68.195:33436 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 09:55:09.607000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:55:09.607000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c0017d6340 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 09:55:09.607000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 09:55:09.610000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:55:09.610000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c00103b2e0 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 09:55:09.610000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 09:55:09.613000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:55:09.613000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c0017d6380 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 09:55:10.362500 kubelet[2593]: I0213 09:55:10.362476 2593 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-3510.3.2-a-e401d5bc82" podStartSLOduration=11.362443392 pod.CreationTimestamp="2024-02-13 09:54:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-02-13 09:55:02.046351472 +0000 UTC m=+3.259871018" watchObservedRunningTime="2024-02-13 09:55:10.362443392 +0000 UTC m=+11.575962841" Feb 13 09:55:09.613000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 09:55:09.614000 audit[2416]: AVC avc: denied { watch } for pid=2416
comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:55:09.614000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c00103b4e0 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 09:55:09.614000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 09:55:10.601000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/opt/libexec/kubernetes/kubelet-plugins/volume/exec" dev="sdb9" ino=525104 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:usr_t:s0 tclass=dir permissive=0 Feb 13 09:55:10.601000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c00197e200 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 09:55:10.601000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 09:55:10.714685 kubelet[2593]: I0213 09:55:10.714659 2593 kuberuntime_manager.go:1114] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Feb 13 09:55:10.714877 env[1473]: time="2024-02-13T09:55:10.714857002Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Feb 13 09:55:10.715058 kubelet[2593]: I0213 09:55:10.714956 2593 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Feb 13 09:55:10.832670 update_engine[1463]: I0213 09:55:10.832501 1463 update_attempter.cc:509] Updating boot flags... Feb 13 09:55:11.545207 kubelet[2593]: I0213 09:55:11.545104 2593 topology_manager.go:210] "Topology Admit Handler" Feb 13 09:55:11.559045 systemd[1]: Created slice kubepods-besteffort-pod0c09d2ba_f193_4428_8eb0_97a7bfd27906.slice. 
Feb 13 09:55:11.612731 kubelet[2593]: I0213 09:55:11.612671 2593 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/0c09d2ba-f193-4428-8eb0-97a7bfd27906-xtables-lock\") pod \"kube-proxy-9g57l\" (UID: \"0c09d2ba-f193-4428-8eb0-97a7bfd27906\") " pod="kube-system/kube-proxy-9g57l" Feb 13 09:55:11.613012 kubelet[2593]: I0213 09:55:11.612765 2593 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkjhv\" (UniqueName: \"kubernetes.io/projected/0c09d2ba-f193-4428-8eb0-97a7bfd27906-kube-api-access-zkjhv\") pod \"kube-proxy-9g57l\" (UID: \"0c09d2ba-f193-4428-8eb0-97a7bfd27906\") " pod="kube-system/kube-proxy-9g57l" Feb 13 09:55:11.613012 kubelet[2593]: I0213 09:55:11.612853 2593 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/0c09d2ba-f193-4428-8eb0-97a7bfd27906-kube-proxy\") pod \"kube-proxy-9g57l\" (UID: \"0c09d2ba-f193-4428-8eb0-97a7bfd27906\") " pod="kube-system/kube-proxy-9g57l" Feb 13 09:55:11.613012 kubelet[2593]: I0213 09:55:11.612988 2593 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0c09d2ba-f193-4428-8eb0-97a7bfd27906-lib-modules\") pod \"kube-proxy-9g57l\" (UID: \"0c09d2ba-f193-4428-8eb0-97a7bfd27906\") " pod="kube-system/kube-proxy-9g57l" Feb 13 09:55:11.759920 kubelet[2593]: I0213 09:55:11.759862 2593 topology_manager.go:210] "Topology Admit Handler" Feb 13 09:55:11.770518 systemd[1]: Created slice kubepods-besteffort-pod22cb89e0_9104_450f_ac62_af5ee6df0404.slice. Feb 13 09:55:11.814649 kubelet[2593]: I0213 09:55:11.814442 2593 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hh4gx\" (UniqueName: \"kubernetes.io/projected/22cb89e0-9104-450f-ac62-af5ee6df0404-kube-api-access-hh4gx\") pod \"tigera-operator-cfc98749c-9vgbf\" (UID: \"22cb89e0-9104-450f-ac62-af5ee6df0404\") " pod="tigera-operator/tigera-operator-cfc98749c-9vgbf" Feb 13 09:55:11.814649 kubelet[2593]: I0213 09:55:11.814614 2593 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/22cb89e0-9104-450f-ac62-af5ee6df0404-var-lib-calico\") pod \"tigera-operator-cfc98749c-9vgbf\" (UID: \"22cb89e0-9104-450f-ac62-af5ee6df0404\") " pod="tigera-operator/tigera-operator-cfc98749c-9vgbf" Feb 13 09:55:11.880597 env[1473]: time="2024-02-13T09:55:11.880473324Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-9g57l,Uid:0c09d2ba-f193-4428-8eb0-97a7bfd27906,Namespace:kube-system,Attempt:0,}" Feb 13 09:55:11.901801 env[1473]: time="2024-02-13T09:55:11.901599610Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 09:55:11.901801 env[1473]: time="2024-02-13T09:55:11.901716080Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 09:55:11.901801 env[1473]: time="2024-02-13T09:55:11.901752192Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 09:55:11.902239 env[1473]: time="2024-02-13T09:55:11.902108275Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/30b8c02c12b0568fd49df8ed3db05fb5c6e021803b5a325b78e38fa7b2a40952 pid=2805 runtime=io.containerd.runc.v2 Feb 13 09:55:11.929171 systemd[1]: Started cri-containerd-30b8c02c12b0568fd49df8ed3db05fb5c6e021803b5a325b78e38fa7b2a40952.scope. Feb 13 09:55:11.948000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:11.948000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:11.948000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:11.948000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:11.948000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:11.948000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:11.948000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:11.948000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:11.948000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:11.948000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:11.948000 audit: BPF prog-id=112 op=LOAD Feb 13 09:55:11.949000 audit[2814]: AVC avc: denied { bpf } for pid=2814 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:11.949000 audit[2814]: SYSCALL arch=c000003e syscall=321 success=yes exit=0 a0=f a1=c000197c48 a2=10 a3=1c items=0 ppid=2805 pid=2814 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:11.949000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3330623863303263313262303536386664343964663865643364623035 Feb 13 09:55:11.949000 audit[2814]: AVC avc: denied { perfmon } for pid=2814 
comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:11.949000 audit[2814]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=0 a1=c0001976b0 a2=3c a3=c items=0 ppid=2805 pid=2814 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:11.949000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3330623863303263313262303536386664343964663865643364623035 Feb 13 09:55:11.949000 audit[2814]: AVC avc: denied { bpf } for pid=2814 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:11.949000 audit[2814]: AVC avc: denied { bpf } for pid=2814 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:11.949000 audit[2814]: AVC avc: denied { bpf } for pid=2814 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:11.949000 audit[2814]: AVC avc: denied { perfmon } for pid=2814 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:11.949000 audit[2814]: AVC avc: denied { perfmon } for pid=2814 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:11.949000 audit[2814]: AVC avc: denied { perfmon } for pid=2814 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:11.949000 audit[2814]: AVC avc: denied { perfmon } for pid=2814 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:11.949000 audit[2814]: AVC avc: denied { perfmon } for pid=2814 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:11.949000 audit[2814]: AVC avc: denied { bpf } for pid=2814 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:11.949000 audit[2814]: AVC avc: denied { bpf } for pid=2814 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:11.949000 audit: BPF prog-id=113 op=LOAD Feb 13 09:55:11.949000 audit[2814]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=5 a1=c0001979d8 a2=78 a3=c0000937c0 items=0 ppid=2805 pid=2814 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:11.949000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3330623863303263313262303536386664343964663865643364623035 Feb 13 09:55:11.950000 audit[2814]: AVC avc: denied { bpf } for pid=2814 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:11.950000 audit[2814]: AVC avc: denied { bpf } for pid=2814 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:11.950000 audit[2814]: AVC avc: denied { perfmon } for pid=2814 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:11.950000 audit[2814]: AVC avc: denied { perfmon } for pid=2814 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:11.950000 audit[2814]: AVC avc: denied { perfmon } for pid=2814 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:11.950000 audit[2814]: AVC avc: denied { perfmon } for pid=2814 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:11.950000 audit[2814]: AVC avc: denied { perfmon } for pid=2814 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:11.950000 audit[2814]: AVC avc: denied { bpf } for pid=2814 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:11.950000 audit[2814]: AVC avc: denied { bpf } for pid=2814 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:11.950000 audit: BPF prog-id=114 op=LOAD Feb 13 09:55:11.950000 audit[2814]: SYSCALL arch=c000003e syscall=321 success=yes exit=18 a0=5 a1=c000197770 a2=78 a3=c000093808 items=0 ppid=2805 pid=2814 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:11.950000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3330623863303263313262303536386664343964663865643364623035 Feb 13 09:55:11.950000 audit: BPF prog-id=114 op=UNLOAD Feb 13 09:55:11.950000 audit: BPF prog-id=113 op=UNLOAD Feb 13 09:55:11.950000 audit[2814]: AVC avc: denied { bpf } for pid=2814 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:11.950000 audit[2814]: AVC avc: denied { bpf } for pid=2814 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:11.950000 audit[2814]: AVC avc: denied { bpf } for pid=2814 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 
tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:11.950000 audit[2814]: AVC avc: denied { perfmon } for pid=2814 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:11.950000 audit[2814]: AVC avc: denied { perfmon } for pid=2814 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:11.950000 audit[2814]: AVC avc: denied { perfmon } for pid=2814 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:11.950000 audit[2814]: AVC avc: denied { perfmon } for pid=2814 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:11.950000 audit[2814]: AVC avc: denied { perfmon } for pid=2814 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:11.950000 audit[2814]: AVC avc: denied { bpf } for pid=2814 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:11.950000 audit[2814]: AVC avc: denied { bpf } for pid=2814 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:11.950000 audit: BPF prog-id=115 op=LOAD Feb 13 09:55:11.950000 audit[2814]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=5 a1=c000197c30 a2=78 a3=c000093c18 items=0 ppid=2805 pid=2814 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:11.950000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3330623863303263313262303536386664343964663865643364623035 Feb 13 09:55:11.972265 env[1473]: time="2024-02-13T09:55:11.972173477Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-9g57l,Uid:0c09d2ba-f193-4428-8eb0-97a7bfd27906,Namespace:kube-system,Attempt:0,} returns sandbox id \"30b8c02c12b0568fd49df8ed3db05fb5c6e021803b5a325b78e38fa7b2a40952\"" Feb 13 09:55:11.977199 env[1473]: time="2024-02-13T09:55:11.977094854Z" level=info msg="CreateContainer within sandbox \"30b8c02c12b0568fd49df8ed3db05fb5c6e021803b5a325b78e38fa7b2a40952\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Feb 13 09:55:11.994750 env[1473]: time="2024-02-13T09:55:11.994631534Z" level=info msg="CreateContainer within sandbox \"30b8c02c12b0568fd49df8ed3db05fb5c6e021803b5a325b78e38fa7b2a40952\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"bea76a56532d71c7beb2c20d48667f09808cd3f0c552242ad38157d14d7495c1\"" Feb 13 09:55:11.995661 env[1473]: time="2024-02-13T09:55:11.995587062Z" level=info msg="StartContainer for \"bea76a56532d71c7beb2c20d48667f09808cd3f0c552242ad38157d14d7495c1\"" Feb 13 09:55:12.030977 systemd[1]: Started cri-containerd-bea76a56532d71c7beb2c20d48667f09808cd3f0c552242ad38157d14d7495c1.scope. 
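
Annotation: the bursts of bpf/perfmon capability AVC denials and "BPF prog-id=... op=LOAD/UNLOAD" records bracketing each "Started cri-containerd-..." line come from runc setting the container up (syscall 321 is bpf(2) on x86-64, most likely loading the cgroup device-filter program); decoding their PROCTITLE with the sketch above ties each burst to its task, since runc's --log argument embeds the sandbox or container ID:

    # Same decoder, applied to a prefix of the runc records above:
    print(decode_proctitle(
        "72756E63"                                                  # "runc"
        "00"
        "2D2D726F6F74"                                              # "--root"
        "00"
        "2F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F"))  # "/run/containerd/runc/k8s.io"
    # ['runc', '--root', '/run/containerd/runc/k8s.io']

The pid-2814 records continue with "--log /run/containerd/io.containerd.runtime.v2.task/k8s.io/30b8c02c12b0...", i.e. the kube-proxy sandbox started above, while the later pid-2846 burst carries the kube-proxy container ID (bea76a56...). The NETFILTER_CFG records that follow decode the same way, e.g. to "iptables -w 5 -W 100000 -N KUBE-PROXY-CANARY -t mangle": kube-proxy, running /usr/sbin/xtables-nft-multi (so syscall 46 = sendmsg(2) carrying the nftables netlink payload), creating its KUBE-* base chains in the mangle, nat, and filter tables for both address families (family=2 and family=10).
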
Feb 13 09:55:12.063000 audit[2846]: AVC avc: denied { perfmon } for pid=2846 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:12.063000 audit[2846]: SYSCALL arch=c000003e syscall=321 success=yes exit=15 a0=0 a1=c0001976b0 a2=3c a3=7f66f483c4e8 items=0 ppid=2805 pid=2846 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:12.063000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6265613736613536353332643731633762656232633230643438363637 Feb 13 09:55:12.064000 audit[2846]: AVC avc: denied { bpf } for pid=2846 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:12.064000 audit[2846]: AVC avc: denied { bpf } for pid=2846 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:12.064000 audit[2846]: AVC avc: denied { bpf } for pid=2846 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:12.064000 audit[2846]: AVC avc: denied { perfmon } for pid=2846 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:12.064000 audit[2846]: AVC avc: denied { perfmon } for pid=2846 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:12.064000 audit[2846]: AVC avc: denied { perfmon } for pid=2846 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:12.064000 audit[2846]: AVC avc: denied { perfmon } for pid=2846 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:12.064000 audit[2846]: AVC avc: denied { perfmon } for pid=2846 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:12.064000 audit[2846]: AVC avc: denied { bpf } for pid=2846 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:12.064000 audit[2846]: AVC avc: denied { bpf } for pid=2846 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:12.064000 audit: BPF prog-id=116 op=LOAD Feb 13 09:55:12.064000 audit[2846]: SYSCALL arch=c000003e syscall=321 success=yes exit=15 a0=5 a1=c0001979d8 a2=78 a3=c000320bf8 items=0 ppid=2805 pid=2846 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:12.064000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6265613736613536353332643731633762656232633230643438363637 Feb 13 09:55:12.064000 audit[2846]: AVC avc: denied { bpf } for pid=2846 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:12.064000 audit[2846]: AVC avc: denied { bpf } for pid=2846 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:12.064000 audit[2846]: AVC avc: denied { perfmon } for pid=2846 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:12.064000 audit[2846]: AVC avc: denied { perfmon } for pid=2846 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:12.064000 audit[2846]: AVC avc: denied { perfmon } for pid=2846 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:12.064000 audit[2846]: AVC avc: denied { perfmon } for pid=2846 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:12.064000 audit[2846]: AVC avc: denied { perfmon } for pid=2846 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:12.064000 audit[2846]: AVC avc: denied { bpf } for pid=2846 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:12.064000 audit[2846]: AVC avc: denied { bpf } for pid=2846 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:12.064000 audit: BPF prog-id=117 op=LOAD Feb 13 09:55:12.064000 audit[2846]: SYSCALL arch=c000003e syscall=321 success=yes exit=17 a0=5 a1=c000197770 a2=78 a3=c000320c48 items=0 ppid=2805 pid=2846 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:12.064000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6265613736613536353332643731633762656232633230643438363637 Feb 13 09:55:12.064000 audit: BPF prog-id=117 op=UNLOAD Feb 13 09:55:12.064000 audit: BPF prog-id=116 op=UNLOAD Feb 13 09:55:12.064000 audit[2846]: AVC avc: denied { bpf } for pid=2846 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:12.064000 audit[2846]: AVC avc: denied { bpf } for pid=2846 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:12.064000 audit[2846]: AVC avc: denied { bpf } for pid=2846 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 
tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:12.064000 audit[2846]: AVC avc: denied { perfmon } for pid=2846 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:12.064000 audit[2846]: AVC avc: denied { perfmon } for pid=2846 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:12.064000 audit[2846]: AVC avc: denied { perfmon } for pid=2846 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:12.064000 audit[2846]: AVC avc: denied { perfmon } for pid=2846 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:12.064000 audit[2846]: AVC avc: denied { perfmon } for pid=2846 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:12.064000 audit[2846]: AVC avc: denied { bpf } for pid=2846 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:12.064000 audit[2846]: AVC avc: denied { bpf } for pid=2846 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:12.064000 audit: BPF prog-id=118 op=LOAD Feb 13 09:55:12.064000 audit[2846]: SYSCALL arch=c000003e syscall=321 success=yes exit=15 a0=5 a1=c000197c30 a2=78 a3=c000320cd8 items=0 ppid=2805 pid=2846 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:12.064000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6265613736613536353332643731633762656232633230643438363637 Feb 13 09:55:12.088557 env[1473]: time="2024-02-13T09:55:12.088480398Z" level=info msg="StartContainer for \"bea76a56532d71c7beb2c20d48667f09808cd3f0c552242ad38157d14d7495c1\" returns successfully" Feb 13 09:55:12.154000 audit[2904]: NETFILTER_CFG table=mangle:59 family=2 entries=1 op=nft_register_chain pid=2904 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 09:55:12.154000 audit[2904]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffe8ef24f00 a2=0 a3=7ffe8ef24eec items=0 ppid=2856 pid=2904 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:12.154000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Feb 13 09:55:12.154000 audit[2905]: NETFILTER_CFG table=mangle:60 family=10 entries=1 op=nft_register_chain pid=2905 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 09:55:12.154000 audit[2905]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffe85c78b20 a2=0 a3=7ffe85c78b0c items=0 ppid=2856 pid=2905 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:12.154000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Feb 13 09:55:12.155000 audit[2906]: NETFILTER_CFG table=nat:61 family=2 entries=1 op=nft_register_chain pid=2906 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 09:55:12.155000 audit[2906]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffdfdbc22c0 a2=0 a3=7ffdfdbc22ac items=0 ppid=2856 pid=2906 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:12.155000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Feb 13 09:55:12.155000 audit[2907]: NETFILTER_CFG table=nat:62 family=10 entries=1 op=nft_register_chain pid=2907 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 09:55:12.155000 audit[2907]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffd5c5a67b0 a2=0 a3=7ffd5c5a679c items=0 ppid=2856 pid=2907 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:12.155000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Feb 13 09:55:12.156000 audit[2908]: NETFILTER_CFG table=filter:63 family=2 entries=1 op=nft_register_chain pid=2908 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 09:55:12.156000 audit[2908]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffe440bc120 a2=0 a3=7ffe440bc10c items=0 ppid=2856 pid=2908 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:12.156000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Feb 13 09:55:12.157000 audit[2909]: NETFILTER_CFG table=filter:64 family=10 entries=1 op=nft_register_chain pid=2909 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 09:55:12.157000 audit[2909]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffd500cc450 a2=0 a3=7ffd500cc43c items=0 ppid=2856 pid=2909 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:12.157000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Feb 13 09:55:12.262000 audit[2912]: NETFILTER_CFG table=filter:65 family=2 entries=1 op=nft_register_chain pid=2912 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 09:55:12.262000 audit[2912]: SYSCALL arch=c000003e syscall=46 success=yes exit=108 a0=3 a1=7ffe7c0c61b0 a2=0 a3=7ffe7c0c619c items=0 ppid=2856 pid=2912 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:12.262000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Feb 13 09:55:12.269000 audit[2914]: NETFILTER_CFG table=filter:66 family=2 entries=1 op=nft_register_rule pid=2914 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 09:55:12.269000 audit[2914]: SYSCALL arch=c000003e syscall=46 success=yes exit=752 a0=3 a1=7ffc3c2328a0 a2=0 a3=7ffc3c23288c items=0 ppid=2856 pid=2914 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:12.269000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C652073657276696365 Feb 13 09:55:12.278000 audit[2917]: NETFILTER_CFG table=filter:67 family=2 entries=1 op=nft_register_rule pid=2917 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 09:55:12.278000 audit[2917]: SYSCALL arch=c000003e syscall=46 success=yes exit=752 a0=3 a1=7ffce88db3e0 a2=0 a3=7ffce88db3cc items=0 ppid=2856 pid=2917 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:12.278000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C65207365727669 Feb 13 09:55:12.281000 audit[2918]: NETFILTER_CFG table=filter:68 family=2 entries=1 op=nft_register_chain pid=2918 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 09:55:12.281000 audit[2918]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fffa61a1f20 a2=0 a3=7fffa61a1f0c items=0 ppid=2856 pid=2918 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:12.281000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Feb 13 09:55:12.287000 audit[2920]: NETFILTER_CFG table=filter:69 family=2 entries=1 op=nft_register_rule pid=2920 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 09:55:12.287000 audit[2920]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7fffb6575ae0 a2=0 a3=7fffb6575acc items=0 ppid=2856 pid=2920 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:12.287000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Feb 13 09:55:12.290000 audit[2921]: NETFILTER_CFG table=filter:70 family=2 entries=1 op=nft_register_chain pid=2921 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 09:55:12.290000 audit[2921]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffe8e21da10 a2=0 
a3=7ffe8e21d9fc items=0 ppid=2856 pid=2921 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:12.290000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D740066696C746572 Feb 13 09:55:12.297000 audit[2923]: NETFILTER_CFG table=filter:71 family=2 entries=1 op=nft_register_rule pid=2923 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 09:55:12.297000 audit[2923]: SYSCALL arch=c000003e syscall=46 success=yes exit=744 a0=3 a1=7fffa8348ec0 a2=0 a3=7fffa8348eac items=0 ppid=2856 pid=2923 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:12.297000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D Feb 13 09:55:12.306000 audit[2926]: NETFILTER_CFG table=filter:72 family=2 entries=1 op=nft_register_rule pid=2926 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 09:55:12.306000 audit[2926]: SYSCALL arch=c000003e syscall=46 success=yes exit=744 a0=3 a1=7ffd099f6350 a2=0 a3=7ffd099f633c items=0 ppid=2856 pid=2926 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:12.306000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D53 Feb 13 09:55:12.308000 audit[2927]: NETFILTER_CFG table=filter:73 family=2 entries=1 op=nft_register_chain pid=2927 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 09:55:12.308000 audit[2927]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffce2414fc0 a2=0 a3=7ffce2414fac items=0 ppid=2856 pid=2927 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:12.308000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D464F5257415244002D740066696C746572 Feb 13 09:55:12.315000 audit[2929]: NETFILTER_CFG table=filter:74 family=2 entries=1 op=nft_register_rule pid=2929 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 09:55:12.315000 audit[2929]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffe01a523f0 a2=0 a3=7ffe01a523dc items=0 ppid=2856 pid=2929 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:12.315000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Feb 13 09:55:12.318000 audit[2930]: NETFILTER_CFG 
table=filter:75 family=2 entries=1 op=nft_register_chain pid=2930 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 09:55:12.318000 audit[2930]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffea1392d50 a2=0 a3=7ffea1392d3c items=0 ppid=2856 pid=2930 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:12.318000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Feb 13 09:55:12.324000 audit[2932]: NETFILTER_CFG table=filter:76 family=2 entries=1 op=nft_register_rule pid=2932 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 09:55:12.324000 audit[2932]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7fff2aee2a10 a2=0 a3=7fff2aee29fc items=0 ppid=2856 pid=2932 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:12.324000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Feb 13 09:55:12.333000 audit[2935]: NETFILTER_CFG table=filter:77 family=2 entries=1 op=nft_register_rule pid=2935 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 09:55:12.333000 audit[2935]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7fff8d73d230 a2=0 a3=7fff8d73d21c items=0 ppid=2856 pid=2935 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:12.333000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Feb 13 09:55:12.342000 audit[2938]: NETFILTER_CFG table=filter:78 family=2 entries=1 op=nft_register_rule pid=2938 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 09:55:12.342000 audit[2938]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffda09265e0 a2=0 a3=7ffda09265cc items=0 ppid=2856 pid=2938 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:12.342000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D Feb 13 09:55:12.345000 audit[2939]: NETFILTER_CFG table=nat:79 family=2 entries=1 op=nft_register_chain pid=2939 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 09:55:12.345000 audit[2939]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffef46b3c40 a2=0 a3=7ffef46b3c2c items=0 ppid=2856 pid=2939 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" 
exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:12.345000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D74006E6174 Feb 13 09:55:12.351000 audit[2941]: NETFILTER_CFG table=nat:80 family=2 entries=1 op=nft_register_rule pid=2941 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 09:55:12.351000 audit[2941]: SYSCALL arch=c000003e syscall=46 success=yes exit=524 a0=3 a1=7fff0291a120 a2=0 a3=7fff0291a10c items=0 ppid=2856 pid=2941 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:12.351000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Feb 13 09:55:12.361000 audit[2944]: NETFILTER_CFG table=nat:81 family=2 entries=1 op=nft_register_rule pid=2944 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 09:55:12.361000 audit[2944]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffc3d8665c0 a2=0 a3=7ffc3d8665ac items=0 ppid=2856 pid=2944 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:12.361000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Feb 13 09:55:12.387000 audit[2948]: NETFILTER_CFG table=filter:82 family=2 entries=6 op=nft_register_rule pid=2948 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 13 09:55:12.387000 audit[2948]: SYSCALL arch=c000003e syscall=46 success=yes exit=4028 a0=3 a1=7fffcd1983c0 a2=0 a3=7fffcd1983ac items=0 ppid=2856 pid=2948 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:12.387000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 09:55:12.403000 audit[2948]: NETFILTER_CFG table=nat:83 family=2 entries=17 op=nft_register_chain pid=2948 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 13 09:55:12.403000 audit[2948]: SYSCALL arch=c000003e syscall=46 success=yes exit=5340 a0=3 a1=7fffcd1983c0 a2=0 a3=7fffcd1983ac items=0 ppid=2856 pid=2948 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:12.403000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 09:55:12.406000 audit[2951]: NETFILTER_CFG table=filter:84 family=10 entries=1 op=nft_register_chain pid=2951 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 09:55:12.406000 audit[2951]: SYSCALL arch=c000003e syscall=46 success=yes exit=108 a0=3 a1=7ffe1fcc0240 a2=0 a3=7ffe1fcc022c items=0 ppid=2856 pid=2951 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 
sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:12.406000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Feb 13 09:55:12.412000 audit[2953]: NETFILTER_CFG table=filter:85 family=10 entries=2 op=nft_register_chain pid=2953 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 09:55:12.412000 audit[2953]: SYSCALL arch=c000003e syscall=46 success=yes exit=836 a0=3 a1=7fff6df4d0f0 a2=0 a3=7fff6df4d0dc items=0 ppid=2856 pid=2953 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:12.412000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C6520736572766963 Feb 13 09:55:12.426000 audit[2956]: NETFILTER_CFG table=filter:86 family=10 entries=2 op=nft_register_chain pid=2956 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 09:55:12.426000 audit[2956]: SYSCALL arch=c000003e syscall=46 success=yes exit=836 a0=3 a1=7ffc0f91d7a0 a2=0 a3=7ffc0f91d78c items=0 ppid=2856 pid=2956 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:12.426000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C652073657276 Feb 13 09:55:12.429000 audit[2957]: NETFILTER_CFG table=filter:87 family=10 entries=1 op=nft_register_chain pid=2957 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 09:55:12.429000 audit[2957]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffc58510c50 a2=0 a3=7ffc58510c3c items=0 ppid=2856 pid=2957 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:12.429000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Feb 13 09:55:12.435000 audit[2959]: NETFILTER_CFG table=filter:88 family=10 entries=1 op=nft_register_rule pid=2959 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 09:55:12.435000 audit[2959]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffc6ef60c10 a2=0 a3=7ffc6ef60bfc items=0 ppid=2856 pid=2959 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:12.435000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Feb 13 09:55:12.438000 audit[2960]: NETFILTER_CFG table=filter:89 family=10 entries=1 
op=nft_register_chain pid=2960 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 09:55:12.438000 audit[2960]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffc9758d280 a2=0 a3=7ffc9758d26c items=0 ppid=2856 pid=2960 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:12.438000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D740066696C746572 Feb 13 09:55:12.444000 audit[2962]: NETFILTER_CFG table=filter:90 family=10 entries=1 op=nft_register_rule pid=2962 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 09:55:12.444000 audit[2962]: SYSCALL arch=c000003e syscall=46 success=yes exit=744 a0=3 a1=7fff045e8930 a2=0 a3=7fff045e891c items=0 ppid=2856 pid=2962 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:12.444000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B554245 Feb 13 09:55:12.453000 audit[2965]: NETFILTER_CFG table=filter:91 family=10 entries=2 op=nft_register_chain pid=2965 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 09:55:12.453000 audit[2965]: SYSCALL arch=c000003e syscall=46 success=yes exit=828 a0=3 a1=7ffcfa69abd0 a2=0 a3=7ffcfa69abbc items=0 ppid=2856 pid=2965 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:12.453000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D Feb 13 09:55:12.455000 audit[2966]: NETFILTER_CFG table=filter:92 family=10 entries=1 op=nft_register_chain pid=2966 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 09:55:12.455000 audit[2966]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fffe32cf9d0 a2=0 a3=7fffe32cf9bc items=0 ppid=2856 pid=2966 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:12.455000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D464F5257415244002D740066696C746572 Feb 13 09:55:12.461000 audit[2968]: NETFILTER_CFG table=filter:93 family=10 entries=1 op=nft_register_rule pid=2968 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 09:55:12.461000 audit[2968]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffe65bdf610 a2=0 a3=7ffe65bdf5fc items=0 ppid=2856 pid=2968 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:12.461000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Feb 13 09:55:12.464000 audit[2969]: NETFILTER_CFG table=filter:94 family=10 entries=1 op=nft_register_chain pid=2969 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 09:55:12.464000 audit[2969]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffd5e5ad900 a2=0 a3=7ffd5e5ad8ec items=0 ppid=2856 pid=2969 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:12.464000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Feb 13 09:55:12.470000 audit[2971]: NETFILTER_CFG table=filter:95 family=10 entries=1 op=nft_register_rule pid=2971 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 09:55:12.470000 audit[2971]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffc0c426470 a2=0 a3=7ffc0c42645c items=0 ppid=2856 pid=2971 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:12.470000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Feb 13 09:55:12.479000 audit[2974]: NETFILTER_CFG table=filter:96 family=10 entries=1 op=nft_register_rule pid=2974 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 09:55:12.479000 audit[2974]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffceadc2ce0 a2=0 a3=7ffceadc2ccc items=0 ppid=2856 pid=2974 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:12.479000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D Feb 13 09:55:12.489000 audit[2977]: NETFILTER_CFG table=filter:97 family=10 entries=1 op=nft_register_rule pid=2977 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 09:55:12.489000 audit[2977]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffe76c94c20 a2=0 a3=7ffe76c94c0c items=0 ppid=2856 pid=2977 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:12.489000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C Feb 13 09:55:12.492000 audit[2978]: NETFILTER_CFG table=nat:98 family=10 entries=1 op=nft_register_chain pid=2978 subj=system_u:system_r:kernel_t:s0 
comm="ip6tables" Feb 13 09:55:12.492000 audit[2978]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffc8b11a420 a2=0 a3=7ffc8b11a40c items=0 ppid=2856 pid=2978 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:12.492000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D74006E6174 Feb 13 09:55:12.498000 audit[2980]: NETFILTER_CFG table=nat:99 family=10 entries=2 op=nft_register_chain pid=2980 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 09:55:12.498000 audit[2980]: SYSCALL arch=c000003e syscall=46 success=yes exit=600 a0=3 a1=7ffe0d55b370 a2=0 a3=7ffe0d55b35c items=0 ppid=2856 pid=2980 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:12.498000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Feb 13 09:55:12.506000 audit[2983]: NETFILTER_CFG table=nat:100 family=10 entries=2 op=nft_register_chain pid=2983 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 09:55:12.506000 audit[2983]: SYSCALL arch=c000003e syscall=46 success=yes exit=608 a0=3 a1=7ffc1cc134a0 a2=0 a3=7ffc1cc1348c items=0 ppid=2856 pid=2983 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:12.506000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Feb 13 09:55:12.519000 audit[2987]: NETFILTER_CFG table=filter:101 family=10 entries=3 op=nft_register_rule pid=2987 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Feb 13 09:55:12.519000 audit[2987]: SYSCALL arch=c000003e syscall=46 success=yes exit=1916 a0=3 a1=7ffc2dbd5a80 a2=0 a3=7ffc2dbd5a6c items=0 ppid=2856 pid=2987 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:12.519000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 09:55:12.520000 audit[2987]: NETFILTER_CFG table=nat:102 family=10 entries=10 op=nft_register_chain pid=2987 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Feb 13 09:55:12.520000 audit[2987]: SYSCALL arch=c000003e syscall=46 success=yes exit=1968 a0=3 a1=7ffc2dbd5a80 a2=0 a3=7ffc2dbd5a6c items=0 ppid=2856 pid=2987 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:12.520000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 09:55:12.675027 env[1473]: time="2024-02-13T09:55:12.674888241Z" 
level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-cfc98749c-9vgbf,Uid:22cb89e0-9104-450f-ac62-af5ee6df0404,Namespace:tigera-operator,Attempt:0,}" Feb 13 09:55:12.697325 env[1473]: time="2024-02-13T09:55:12.697205144Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 09:55:12.697325 env[1473]: time="2024-02-13T09:55:12.697295166Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 09:55:12.697616 env[1473]: time="2024-02-13T09:55:12.697330083Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 09:55:12.697745 env[1473]: time="2024-02-13T09:55:12.697674446Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/8582f78f08d94d131fc08379d181995e5a66682abda9319b30514023410d9cd2 pid=2996 runtime=io.containerd.runc.v2 Feb 13 09:55:12.722661 systemd[1]: Started cri-containerd-8582f78f08d94d131fc08379d181995e5a66682abda9319b30514023410d9cd2.scope. Feb 13 09:55:12.746000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:12.746000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:12.746000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:12.746000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:12.746000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:12.746000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:12.746000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:12.746000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:12.746000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:12.747000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:12.747000 audit: BPF prog-id=119 op=LOAD Feb 13 09:55:12.748000 audit[3006]: AVC avc: denied { bpf } for pid=3006 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 
09:55:12.748000 audit[3006]: SYSCALL arch=c000003e syscall=321 success=yes exit=0 a0=f a1=c000197c48 a2=10 a3=1c items=0 ppid=2996 pid=3006 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:12.748000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3835383266373866303864393464313331666330383337396431383139 Feb 13 09:55:12.748000 audit[3006]: AVC avc: denied { perfmon } for pid=3006 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:12.748000 audit[3006]: SYSCALL arch=c000003e syscall=321 success=yes exit=15 a0=0 a1=c0001976b0 a2=3c a3=c items=0 ppid=2996 pid=3006 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:12.748000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3835383266373866303864393464313331666330383337396431383139 Feb 13 09:55:12.748000 audit[3006]: AVC avc: denied { bpf } for pid=3006 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:12.748000 audit[3006]: AVC avc: denied { bpf } for pid=3006 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:12.748000 audit[3006]: AVC avc: denied { bpf } for pid=3006 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:12.748000 audit[3006]: AVC avc: denied { perfmon } for pid=3006 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:12.748000 audit[3006]: AVC avc: denied { perfmon } for pid=3006 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:12.748000 audit[3006]: AVC avc: denied { perfmon } for pid=3006 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:12.748000 audit[3006]: AVC avc: denied { perfmon } for pid=3006 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:12.748000 audit[3006]: AVC avc: denied { perfmon } for pid=3006 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:12.748000 audit[3006]: AVC avc: denied { bpf } for pid=3006 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:12.748000 audit[3006]: AVC avc: denied { bpf } for pid=3006 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 
tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:12.748000 audit: BPF prog-id=120 op=LOAD Feb 13 09:55:12.748000 audit[3006]: SYSCALL arch=c000003e syscall=321 success=yes exit=15 a0=5 a1=c0001979d8 a2=78 a3=c00009bad0 items=0 ppid=2996 pid=3006 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:12.748000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3835383266373866303864393464313331666330383337396431383139 Feb 13 09:55:12.748000 audit[3006]: AVC avc: denied { bpf } for pid=3006 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:12.748000 audit[3006]: AVC avc: denied { bpf } for pid=3006 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:12.748000 audit[3006]: AVC avc: denied { perfmon } for pid=3006 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:12.748000 audit[3006]: AVC avc: denied { perfmon } for pid=3006 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:12.748000 audit[3006]: AVC avc: denied { perfmon } for pid=3006 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:12.748000 audit[3006]: AVC avc: denied { perfmon } for pid=3006 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:12.748000 audit[3006]: AVC avc: denied { perfmon } for pid=3006 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:12.748000 audit[3006]: AVC avc: denied { bpf } for pid=3006 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:12.748000 audit[3006]: AVC avc: denied { bpf } for pid=3006 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:12.748000 audit: BPF prog-id=121 op=LOAD Feb 13 09:55:12.748000 audit[3006]: SYSCALL arch=c000003e syscall=321 success=yes exit=17 a0=5 a1=c000197770 a2=78 a3=c00009bb18 items=0 ppid=2996 pid=3006 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:12.748000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3835383266373866303864393464313331666330383337396431383139 Feb 13 09:55:12.749000 audit: BPF prog-id=121 op=UNLOAD Feb 13 09:55:12.749000 audit: BPF prog-id=120 op=UNLOAD Feb 13 09:55:12.749000 audit[3006]: AVC avc: 
denied { bpf } for pid=3006 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:12.749000 audit[3006]: AVC avc: denied { bpf } for pid=3006 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:12.749000 audit[3006]: AVC avc: denied { bpf } for pid=3006 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:12.749000 audit[3006]: AVC avc: denied { perfmon } for pid=3006 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:12.749000 audit[3006]: AVC avc: denied { perfmon } for pid=3006 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:12.749000 audit[3006]: AVC avc: denied { perfmon } for pid=3006 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:12.749000 audit[3006]: AVC avc: denied { perfmon } for pid=3006 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:12.749000 audit[3006]: AVC avc: denied { perfmon } for pid=3006 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:12.749000 audit[3006]: AVC avc: denied { bpf } for pid=3006 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:12.749000 audit[3006]: AVC avc: denied { bpf } for pid=3006 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:12.749000 audit: BPF prog-id=122 op=LOAD Feb 13 09:55:12.749000 audit[3006]: SYSCALL arch=c000003e syscall=321 success=yes exit=15 a0=5 a1=c000197c30 a2=78 a3=c00009bf28 items=0 ppid=2996 pid=3006 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:12.749000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3835383266373866303864393464313331666330383337396431383139 Feb 13 09:55:12.799920 env[1473]: time="2024-02-13T09:55:12.799887334Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-cfc98749c-9vgbf,Uid:22cb89e0-9104-450f-ac62-af5ee6df0404,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"8582f78f08d94d131fc08379d181995e5a66682abda9319b30514023410d9cd2\"" Feb 13 09:55:12.800902 env[1473]: time="2024-02-13T09:55:12.800859759Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.32.3\"" Feb 13 09:55:13.167146 kubelet[2593]: I0213 09:55:13.167097 2593 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="kube-system/kube-proxy-9g57l" podStartSLOduration=2.167074782 pod.CreationTimestamp="2024-02-13 09:55:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-02-13 09:55:13.166880467 +0000 UTC m=+14.380399915" watchObservedRunningTime="2024-02-13 09:55:13.167074782 +0000 UTC m=+14.380594228" Feb 13 09:55:13.914311 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3189634492.mount: Deactivated successfully. Feb 13 09:55:14.665629 env[1473]: time="2024-02-13T09:55:14.665580139Z" level=info msg="ImageCreate event &ImageCreate{Name:quay.io/tigera/operator:v1.32.3,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 09:55:14.666187 env[1473]: time="2024-02-13T09:55:14.666145198Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:7bc79e0d3be4fa8c35133127424f9b1ec775af43145b7dd58637905c76084827,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 09:55:14.667175 env[1473]: time="2024-02-13T09:55:14.667118965Z" level=info msg="ImageUpdate event &ImageUpdate{Name:quay.io/tigera/operator:v1.32.3,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 09:55:14.668029 env[1473]: time="2024-02-13T09:55:14.667983589Z" level=info msg="ImageCreate event &ImageCreate{Name:quay.io/tigera/operator@sha256:715ac9a30f8a9579e44258af20de354715429e11836b493918e9e1a696e9b028,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 09:55:14.668446 env[1473]: time="2024-02-13T09:55:14.668405573Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.32.3\" returns image reference \"sha256:7bc79e0d3be4fa8c35133127424f9b1ec775af43145b7dd58637905c76084827\"" Feb 13 09:55:14.669627 env[1473]: time="2024-02-13T09:55:14.669611934Z" level=info msg="CreateContainer within sandbox \"8582f78f08d94d131fc08379d181995e5a66682abda9319b30514023410d9cd2\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Feb 13 09:55:14.674655 env[1473]: time="2024-02-13T09:55:14.674619455Z" level=info msg="CreateContainer within sandbox \"8582f78f08d94d131fc08379d181995e5a66682abda9319b30514023410d9cd2\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"3ced500e6ec77b9fb7031eae04998b00fcf2e3f5be338fb31d7b4ae96e53de9e\"" Feb 13 09:55:14.675083 env[1473]: time="2024-02-13T09:55:14.675019659Z" level=info msg="StartContainer for \"3ced500e6ec77b9fb7031eae04998b00fcf2e3f5be338fb31d7b4ae96e53de9e\"" Feb 13 09:55:14.684132 systemd[1]: Started cri-containerd-3ced500e6ec77b9fb7031eae04998b00fcf2e3f5be338fb31d7b4ae96e53de9e.scope. 
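The audit trail above records each kube-proxy iptables/ip6tables invocation three ways: a NETFILTER_CFG event (table, address family — 2 is IPv4, 10 is IPv6 — and the count of rules or chains touched), a SYSCALL event (syscall 46 on x86-64 is sendmsg, the netlink request that xtables-nft-multi sends to nf_tables), and a PROCTITLE event carrying the full command line hex-encoded with NUL-separated arguments. A minimal decoding sketch in Python, using the first PROCTITLE value from the records above (the script and variable names are illustrative, not part of the log):

    # decode_proctitle.py -- minimal sketch: an audit PROCTITLE value is the
    # process argv, hex-encoded, with NUL bytes separating the arguments.
    hex_title = ("69707461626C6573002D770035002D5700313030303030"
                 "002D4E004B5542452D5345525649434553002D74006E6174")
    argv = bytes.fromhex(hex_title).split(b"\x00")
    print(" ".join(arg.decode() for arg in argv))
    # -> iptables -w 5 -W 100000 -N KUBE-SERVICES -t nat

The decoded command shows kube-proxy creating its KUBE-SERVICES chain in the nat table; the later family=10 records repeat the same chain and rule setup through ip6tables for IPv6.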
Feb 13 09:55:14.688000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:14.717129 kernel: kauditd_printk_skb: 294 callbacks suppressed Feb 13 09:55:14.717167 kernel: audit: type=1400 audit(1707818114.688:1053): avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:14.688000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:14.844687 kernel: audit: type=1400 audit(1707818114.688:1054): avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:14.844712 kernel: audit: type=1400 audit(1707818114.688:1055): avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:14.688000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:14.688000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:14.972719 kernel: audit: type=1400 audit(1707818114.688:1056): avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:14.972745 kernel: audit: type=1400 audit(1707818114.688:1057): avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:14.688000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:15.036857 kernel: audit: type=1400 audit(1707818114.688:1058): avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:14.688000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:15.101036 kernel: audit: type=1400 audit(1707818114.688:1059): avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:14.688000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:15.165203 kernel: audit: type=1400 audit(1707818114.688:1060): avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:14.688000 audit[1]: AVC 
avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:14.688000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:15.293416 kernel: audit: type=1400 audit(1707818114.688:1061): avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:15.293457 kernel: audit: type=1400 audit(1707818114.779:1062): avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:14.779000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:14.779000 audit: BPF prog-id=123 op=LOAD Feb 13 09:55:14.779000 audit[3035]: AVC avc: denied { bpf } for pid=3035 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:14.779000 audit[3035]: SYSCALL arch=c000003e syscall=321 success=yes exit=0 a0=f a1=c000147c48 a2=10 a3=1c items=0 ppid=2996 pid=3035 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:14.779000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3363656435303065366563373762396662373033316561653034393938 Feb 13 09:55:14.779000 audit[3035]: AVC avc: denied { perfmon } for pid=3035 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:14.779000 audit[3035]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=0 a1=c0001476b0 a2=3c a3=8 items=0 ppid=2996 pid=3035 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:14.779000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3363656435303065366563373762396662373033316561653034393938 Feb 13 09:55:14.779000 audit[3035]: AVC avc: denied { bpf } for pid=3035 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:14.779000 audit[3035]: AVC avc: denied { bpf } for pid=3035 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:14.779000 audit[3035]: AVC avc: denied { bpf } for pid=3035 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:14.779000 audit[3035]: AVC avc: denied { perfmon } for pid=3035 
comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:14.779000 audit[3035]: AVC avc: denied { perfmon } for pid=3035 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:14.779000 audit[3035]: AVC avc: denied { perfmon } for pid=3035 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:14.779000 audit[3035]: AVC avc: denied { perfmon } for pid=3035 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:14.779000 audit[3035]: AVC avc: denied { perfmon } for pid=3035 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:14.779000 audit[3035]: AVC avc: denied { bpf } for pid=3035 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:14.779000 audit[3035]: AVC avc: denied { bpf } for pid=3035 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:14.779000 audit: BPF prog-id=124 op=LOAD Feb 13 09:55:14.779000 audit[3035]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=5 a1=c0001479d8 a2=78 a3=c00023fd40 items=0 ppid=2996 pid=3035 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:14.779000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3363656435303065366563373762396662373033316561653034393938 Feb 13 09:55:14.907000 audit[3035]: AVC avc: denied { bpf } for pid=3035 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:14.907000 audit[3035]: AVC avc: denied { bpf } for pid=3035 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:14.907000 audit[3035]: AVC avc: denied { perfmon } for pid=3035 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:14.907000 audit[3035]: AVC avc: denied { perfmon } for pid=3035 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:14.907000 audit[3035]: AVC avc: denied { perfmon } for pid=3035 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:14.907000 audit[3035]: AVC avc: denied { perfmon } for pid=3035 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:14.907000 audit[3035]: AVC avc: denied { perfmon } for pid=3035 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 
tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:14.907000 audit[3035]: AVC avc: denied { bpf } for pid=3035 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:14.907000 audit[3035]: AVC avc: denied { bpf } for pid=3035 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:14.907000 audit: BPF prog-id=125 op=LOAD Feb 13 09:55:14.907000 audit[3035]: SYSCALL arch=c000003e syscall=321 success=yes exit=18 a0=5 a1=c000147770 a2=78 a3=c00023fd88 items=0 ppid=2996 pid=3035 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:14.907000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3363656435303065366563373762396662373033316561653034393938 Feb 13 09:55:15.035000 audit: BPF prog-id=125 op=UNLOAD Feb 13 09:55:15.035000 audit: BPF prog-id=124 op=UNLOAD Feb 13 09:55:15.035000 audit[3035]: AVC avc: denied { bpf } for pid=3035 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:15.035000 audit[3035]: AVC avc: denied { bpf } for pid=3035 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:15.035000 audit[3035]: AVC avc: denied { bpf } for pid=3035 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:15.035000 audit[3035]: AVC avc: denied { perfmon } for pid=3035 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:15.035000 audit[3035]: AVC avc: denied { perfmon } for pid=3035 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:15.035000 audit[3035]: AVC avc: denied { perfmon } for pid=3035 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:15.035000 audit[3035]: AVC avc: denied { perfmon } for pid=3035 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:15.035000 audit[3035]: AVC avc: denied { perfmon } for pid=3035 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:15.035000 audit[3035]: AVC avc: denied { bpf } for pid=3035 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:15.035000 audit[3035]: AVC avc: denied { bpf } for pid=3035 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:15.035000 audit: BPF prog-id=126 op=LOAD Feb 13 09:55:15.035000 audit[3035]: SYSCALL arch=c000003e syscall=321 
success=yes exit=16 a0=5 a1=c000147c30 a2=78 a3=c00039a198 items=0 ppid=2996 pid=3035 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:15.035000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3363656435303065366563373762396662373033316561653034393938 Feb 13 09:55:15.422269 env[1473]: time="2024-02-13T09:55:15.422199258Z" level=info msg="StartContainer for \"3ced500e6ec77b9fb7031eae04998b00fcf2e3f5be338fb31d7b4ae96e53de9e\" returns successfully" Feb 13 09:55:15.904389 kubelet[2593]: I0213 09:55:15.904362 2593 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="tigera-operator/tigera-operator-cfc98749c-9vgbf" podStartSLOduration=-9.22337203195045e+09 pod.CreationTimestamp="2024-02-13 09:55:11 +0000 UTC" firstStartedPulling="2024-02-13 09:55:12.800556759 +0000 UTC m=+14.014076214" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-02-13 09:55:15.904027429 +0000 UTC m=+17.117546881" watchObservedRunningTime="2024-02-13 09:55:15.904324204 +0000 UTC m=+17.117843650" Feb 13 09:55:16.982000 audit[3104]: NETFILTER_CFG table=filter:103 family=2 entries=13 op=nft_register_rule pid=3104 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 13 09:55:16.982000 audit[3104]: SYSCALL arch=c000003e syscall=46 success=yes exit=4732 a0=3 a1=7ffc95688fe0 a2=0 a3=7ffc95688fcc items=0 ppid=2856 pid=3104 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:16.982000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 09:55:16.983000 audit[3104]: NETFILTER_CFG table=nat:104 family=2 entries=20 op=nft_register_rule pid=3104 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 13 09:55:16.983000 audit[3104]: SYSCALL arch=c000003e syscall=46 success=yes exit=5340 a0=3 a1=7ffc95688fe0 a2=0 a3=7ffc95688fcc items=0 ppid=2856 pid=3104 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:16.983000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 09:55:17.038000 audit[3130]: NETFILTER_CFG table=filter:105 family=2 entries=14 op=nft_register_rule pid=3130 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 13 09:55:17.038000 audit[3130]: SYSCALL arch=c000003e syscall=46 success=yes exit=4732 a0=3 a1=7ffdc31cb380 a2=0 a3=7ffdc31cb36c items=0 ppid=2856 pid=3130 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:17.038000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 09:55:17.039000 audit[3130]: NETFILTER_CFG table=nat:106 family=2 entries=20 op=nft_register_rule pid=3130 
subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 13 09:55:17.039000 audit[3130]: SYSCALL arch=c000003e syscall=46 success=yes exit=5340 a0=3 a1=7ffdc31cb380 a2=0 a3=7ffdc31cb36c items=0 ppid=2856 pid=3130 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:17.039000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 09:55:17.109699 kubelet[2593]: I0213 09:55:17.109633 2593 topology_manager.go:210] "Topology Admit Handler" Feb 13 09:55:17.123612 systemd[1]: Created slice kubepods-besteffort-pod8fc26090_e86d_4c25_96ee_b0a7bf8af75f.slice. Feb 13 09:55:17.149129 kubelet[2593]: I0213 09:55:17.149103 2593 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6zx6\" (UniqueName: \"kubernetes.io/projected/8fc26090-e86d-4c25-96ee-b0a7bf8af75f-kube-api-access-m6zx6\") pod \"calico-typha-5cb848556c-k44gx\" (UID: \"8fc26090-e86d-4c25-96ee-b0a7bf8af75f\") " pod="calico-system/calico-typha-5cb848556c-k44gx" Feb 13 09:55:17.149275 kubelet[2593]: I0213 09:55:17.149144 2593 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8fc26090-e86d-4c25-96ee-b0a7bf8af75f-tigera-ca-bundle\") pod \"calico-typha-5cb848556c-k44gx\" (UID: \"8fc26090-e86d-4c25-96ee-b0a7bf8af75f\") " pod="calico-system/calico-typha-5cb848556c-k44gx" Feb 13 09:55:17.149275 kubelet[2593]: I0213 09:55:17.149171 2593 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/8fc26090-e86d-4c25-96ee-b0a7bf8af75f-typha-certs\") pod \"calico-typha-5cb848556c-k44gx\" (UID: \"8fc26090-e86d-4c25-96ee-b0a7bf8af75f\") " pod="calico-system/calico-typha-5cb848556c-k44gx" Feb 13 09:55:17.156846 kubelet[2593]: I0213 09:55:17.156827 2593 topology_manager.go:210] "Topology Admit Handler" Feb 13 09:55:17.159794 systemd[1]: Created slice kubepods-besteffort-podb426eab0_a52b_41de_addb_7821036fa0b4.slice. 
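Each pod admission above follows the same sequence: kubelet's topology manager admits the pod ("Topology Admit Handler"), systemd creates a cgroup slice for it, and the volume reconciler then attaches the pod's volumes (the reconciler_common.go records that follow). The slice name is derived from the pod's QoS class and UID, with dashes mapped to underscores because "-" is reserved as the hierarchy separator in systemd slice names. A minimal sketch of that mapping for the BestEffort pods in this log (the helper function is illustrative, not a kubelet API):

    # pod_slice.py -- minimal sketch: reconstruct the systemd slice name that
    # kubelet creates for a BestEffort pod, as seen in the records above.
    def besteffort_pod_slice(pod_uid: str) -> str:
        # systemd reserves "-" for slice hierarchy, so the UID's dashes
        # become underscores in the unit name.
        return "kubepods-besteffort-pod{}.slice".format(pod_uid.replace("-", "_"))

    print(besteffort_pod_slice("b426eab0-a52b-41de-addb-7821036fa0b4"))
    # -> kubepods-besteffort-podb426eab0_a52b_41de_addb_7821036fa0b4.slice

This matches the slices created above for calico-typha (UID 8fc26090-e86d-4c25-96ee-b0a7bf8af75f) and calico-node (UID b426eab0-a52b-41de-addb-7821036fa0b4).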
Feb 13 09:55:17.249805 kubelet[2593]: I0213 09:55:17.249647 2593 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b426eab0-a52b-41de-addb-7821036fa0b4-lib-modules\") pod \"calico-node-cfb9g\" (UID: \"b426eab0-a52b-41de-addb-7821036fa0b4\") " pod="calico-system/calico-node-cfb9g" Feb 13 09:55:17.249805 kubelet[2593]: I0213 09:55:17.249733 2593 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/b426eab0-a52b-41de-addb-7821036fa0b4-xtables-lock\") pod \"calico-node-cfb9g\" (UID: \"b426eab0-a52b-41de-addb-7821036fa0b4\") " pod="calico-system/calico-node-cfb9g" Feb 13 09:55:17.250141 kubelet[2593]: I0213 09:55:17.249835 2593 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/b426eab0-a52b-41de-addb-7821036fa0b4-flexvol-driver-host\") pod \"calico-node-cfb9g\" (UID: \"b426eab0-a52b-41de-addb-7821036fa0b4\") " pod="calico-system/calico-node-cfb9g" Feb 13 09:55:17.250141 kubelet[2593]: I0213 09:55:17.249899 2593 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/b426eab0-a52b-41de-addb-7821036fa0b4-policysync\") pod \"calico-node-cfb9g\" (UID: \"b426eab0-a52b-41de-addb-7821036fa0b4\") " pod="calico-system/calico-node-cfb9g" Feb 13 09:55:17.250141 kubelet[2593]: I0213 09:55:17.250097 2593 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/b426eab0-a52b-41de-addb-7821036fa0b4-cni-log-dir\") pod \"calico-node-cfb9g\" (UID: \"b426eab0-a52b-41de-addb-7821036fa0b4\") " pod="calico-system/calico-node-cfb9g" Feb 13 09:55:17.250558 kubelet[2593]: I0213 09:55:17.250167 2593 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/b426eab0-a52b-41de-addb-7821036fa0b4-node-certs\") pod \"calico-node-cfb9g\" (UID: \"b426eab0-a52b-41de-addb-7821036fa0b4\") " pod="calico-system/calico-node-cfb9g" Feb 13 09:55:17.250558 kubelet[2593]: I0213 09:55:17.250265 2593 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/b426eab0-a52b-41de-addb-7821036fa0b4-cni-net-dir\") pod \"calico-node-cfb9g\" (UID: \"b426eab0-a52b-41de-addb-7821036fa0b4\") " pod="calico-system/calico-node-cfb9g" Feb 13 09:55:17.250558 kubelet[2593]: I0213 09:55:17.250379 2593 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/b426eab0-a52b-41de-addb-7821036fa0b4-var-run-calico\") pod \"calico-node-cfb9g\" (UID: \"b426eab0-a52b-41de-addb-7821036fa0b4\") " pod="calico-system/calico-node-cfb9g" Feb 13 09:55:17.250558 kubelet[2593]: I0213 09:55:17.250468 2593 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/b426eab0-a52b-41de-addb-7821036fa0b4-var-lib-calico\") pod \"calico-node-cfb9g\" (UID: \"b426eab0-a52b-41de-addb-7821036fa0b4\") " pod="calico-system/calico-node-cfb9g" Feb 13 09:55:17.251495 kubelet[2593]: I0213 09:55:17.250650 2593 reconciler_common.go:253] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b426eab0-a52b-41de-addb-7821036fa0b4-tigera-ca-bundle\") pod \"calico-node-cfb9g\" (UID: \"b426eab0-a52b-41de-addb-7821036fa0b4\") " pod="calico-system/calico-node-cfb9g" Feb 13 09:55:17.251495 kubelet[2593]: I0213 09:55:17.250783 2593 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/b426eab0-a52b-41de-addb-7821036fa0b4-cni-bin-dir\") pod \"calico-node-cfb9g\" (UID: \"b426eab0-a52b-41de-addb-7821036fa0b4\") " pod="calico-system/calico-node-cfb9g" Feb 13 09:55:17.251495 kubelet[2593]: I0213 09:55:17.250866 2593 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmmkh\" (UniqueName: \"kubernetes.io/projected/b426eab0-a52b-41de-addb-7821036fa0b4-kube-api-access-mmmkh\") pod \"calico-node-cfb9g\" (UID: \"b426eab0-a52b-41de-addb-7821036fa0b4\") " pod="calico-system/calico-node-cfb9g" Feb 13 09:55:17.280677 kubelet[2593]: I0213 09:55:17.280614 2593 topology_manager.go:210] "Topology Admit Handler" Feb 13 09:55:17.281285 kubelet[2593]: E0213 09:55:17.281222 2593 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250 Feb 13 09:55:17.351171 kubelet[2593]: I0213 09:55:17.351124 2593 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/70a6a2a2-80be-4700-bde4-cdae2bf45250-registration-dir\") pod \"csi-node-driver-w8xgk\" (UID: \"70a6a2a2-80be-4700-bde4-cdae2bf45250\") " pod="calico-system/csi-node-driver-w8xgk" Feb 13 09:55:17.351479 kubelet[2593]: I0213 09:55:17.351241 2593 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/70a6a2a2-80be-4700-bde4-cdae2bf45250-varrun\") pod \"csi-node-driver-w8xgk\" (UID: \"70a6a2a2-80be-4700-bde4-cdae2bf45250\") " pod="calico-system/csi-node-driver-w8xgk" Feb 13 09:55:17.351630 kubelet[2593]: I0213 09:55:17.351470 2593 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/70a6a2a2-80be-4700-bde4-cdae2bf45250-kubelet-dir\") pod \"csi-node-driver-w8xgk\" (UID: \"70a6a2a2-80be-4700-bde4-cdae2bf45250\") " pod="calico-system/csi-node-driver-w8xgk" Feb 13 09:55:17.351826 kubelet[2593]: I0213 09:55:17.351788 2593 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qg46l\" (UniqueName: \"kubernetes.io/projected/70a6a2a2-80be-4700-bde4-cdae2bf45250-kube-api-access-qg46l\") pod \"csi-node-driver-w8xgk\" (UID: \"70a6a2a2-80be-4700-bde4-cdae2bf45250\") " pod="calico-system/csi-node-driver-w8xgk" Feb 13 09:55:17.352058 kubelet[2593]: I0213 09:55:17.352026 2593 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/70a6a2a2-80be-4700-bde4-cdae2bf45250-socket-dir\") pod \"csi-node-driver-w8xgk\" (UID: \"70a6a2a2-80be-4700-bde4-cdae2bf45250\") " pod="calico-system/csi-node-driver-w8xgk" Feb 13 
09:55:17.353833 kubelet[2593]: E0213 09:55:17.353795 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:17.353833 kubelet[2593]: W0213 09:55:17.353827 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:17.354126 kubelet[2593]: E0213 09:55:17.353873 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:17.356698 kubelet[2593]: E0213 09:55:17.356663 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:17.356698 kubelet[2593]: W0213 09:55:17.356684 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:17.356698 kubelet[2593]: E0213 09:55:17.356706 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:17.430768 env[1473]: time="2024-02-13T09:55:17.430639681Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5cb848556c-k44gx,Uid:8fc26090-e86d-4c25-96ee-b0a7bf8af75f,Namespace:calico-system,Attempt:0,}" Feb 13 09:55:17.453838 kubelet[2593]: E0213 09:55:17.453775 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:17.453838 kubelet[2593]: W0213 09:55:17.453811 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:17.453838 kubelet[2593]: E0213 09:55:17.453853 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:17.454541 kubelet[2593]: E0213 09:55:17.454329 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:17.454541 kubelet[2593]: W0213 09:55:17.454385 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:17.454541 kubelet[2593]: E0213 09:55:17.454420 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 09:55:17.455038 kubelet[2593]: E0213 09:55:17.454965 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:17.455038 kubelet[2593]: W0213 09:55:17.455006 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:17.455423 kubelet[2593]: E0213 09:55:17.455067 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:17.455677 kubelet[2593]: E0213 09:55:17.455591 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:17.455677 kubelet[2593]: W0213 09:55:17.455623 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:17.455677 kubelet[2593]: E0213 09:55:17.455673 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:17.456139 kubelet[2593]: E0213 09:55:17.456113 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:17.456274 env[1473]: time="2024-02-13T09:55:17.455640936Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 09:55:17.456274 env[1473]: time="2024-02-13T09:55:17.455730612Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 09:55:17.456274 env[1473]: time="2024-02-13T09:55:17.455766112Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 09:55:17.456274 env[1473]: time="2024-02-13T09:55:17.456103953Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/6cb760095bb7cae0bc1ca37f08c6e651f85b4b39fb1134b7d6752c89b93b06e8 pid=3143 runtime=io.containerd.runc.v2 Feb 13 09:55:17.456729 kubelet[2593]: W0213 09:55:17.456154 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:17.456729 kubelet[2593]: E0213 09:55:17.456218 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 09:55:17.456729 kubelet[2593]: E0213 09:55:17.456695 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:17.456729 kubelet[2593]: W0213 09:55:17.456727 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:17.457116 kubelet[2593]: E0213 09:55:17.456822 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:17.457224 kubelet[2593]: E0213 09:55:17.457183 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:17.457224 kubelet[2593]: W0213 09:55:17.457208 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:17.457442 kubelet[2593]: E0213 09:55:17.457276 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:17.457850 kubelet[2593]: E0213 09:55:17.457775 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:17.457850 kubelet[2593]: W0213 09:55:17.457800 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:17.458181 kubelet[2593]: E0213 09:55:17.457902 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:17.458311 kubelet[2593]: E0213 09:55:17.458261 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:17.458311 kubelet[2593]: W0213 09:55:17.458285 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:17.458683 kubelet[2593]: E0213 09:55:17.458324 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:17.459085 kubelet[2593]: E0213 09:55:17.459037 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:17.459085 kubelet[2593]: W0213 09:55:17.459083 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:17.459488 kubelet[2593]: E0213 09:55:17.459153 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 09:55:17.459779 kubelet[2593]: E0213 09:55:17.459703 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:17.459779 kubelet[2593]: W0213 09:55:17.459738 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:17.460112 kubelet[2593]: E0213 09:55:17.459799 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:17.460291 kubelet[2593]: E0213 09:55:17.460259 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:17.460476 kubelet[2593]: W0213 09:55:17.460294 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:17.460476 kubelet[2593]: E0213 09:55:17.460397 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:17.460824 kubelet[2593]: E0213 09:55:17.460751 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:17.460824 kubelet[2593]: W0213 09:55:17.460782 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:17.461175 kubelet[2593]: E0213 09:55:17.460875 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:17.461291 kubelet[2593]: E0213 09:55:17.461224 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:17.461291 kubelet[2593]: W0213 09:55:17.461254 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:17.461554 kubelet[2593]: E0213 09:55:17.461333 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:17.461861 kubelet[2593]: E0213 09:55:17.461785 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:17.461861 kubelet[2593]: W0213 09:55:17.461818 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:17.462190 kubelet[2593]: E0213 09:55:17.461917 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 09:55:17.462317 kubelet[2593]: E0213 09:55:17.462235 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:17.462317 kubelet[2593]: W0213 09:55:17.462263 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:17.462646 kubelet[2593]: E0213 09:55:17.462368 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:17.462815 kubelet[2593]: E0213 09:55:17.462784 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:17.462982 kubelet[2593]: W0213 09:55:17.462821 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:17.462982 kubelet[2593]: E0213 09:55:17.462912 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:17.463319 kubelet[2593]: E0213 09:55:17.463215 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:17.463319 kubelet[2593]: W0213 09:55:17.463244 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:17.463571 kubelet[2593]: E0213 09:55:17.463335 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:17.463706 kubelet[2593]: E0213 09:55:17.463673 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:17.463870 kubelet[2593]: W0213 09:55:17.463706 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:17.463870 kubelet[2593]: E0213 09:55:17.463794 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:17.464301 kubelet[2593]: E0213 09:55:17.464266 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:17.464301 kubelet[2593]: W0213 09:55:17.464301 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:17.464671 kubelet[2593]: E0213 09:55:17.464414 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 09:55:17.464924 kubelet[2593]: E0213 09:55:17.464893 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:17.465076 kubelet[2593]: W0213 09:55:17.464929 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:17.465076 kubelet[2593]: E0213 09:55:17.465022 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:17.465523 kubelet[2593]: E0213 09:55:17.465493 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:17.465670 kubelet[2593]: W0213 09:55:17.465525 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:17.465670 kubelet[2593]: E0213 09:55:17.465612 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:17.466047 kubelet[2593]: E0213 09:55:17.466016 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:17.466197 kubelet[2593]: W0213 09:55:17.466053 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:17.466197 kubelet[2593]: E0213 09:55:17.466111 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:17.466796 kubelet[2593]: E0213 09:55:17.466727 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:17.466796 kubelet[2593]: W0213 09:55:17.466761 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:17.467097 kubelet[2593]: E0213 09:55:17.466826 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:17.467394 kubelet[2593]: E0213 09:55:17.467330 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:17.467548 kubelet[2593]: W0213 09:55:17.467392 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:17.467548 kubelet[2593]: E0213 09:55:17.467444 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 09:55:17.468125 kubelet[2593]: E0213 09:55:17.468074 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:17.468125 kubelet[2593]: W0213 09:55:17.468110 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:17.468359 kubelet[2593]: E0213 09:55:17.468163 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:17.481949 systemd[1]: Started cri-containerd-6cb760095bb7cae0bc1ca37f08c6e651f85b4b39fb1134b7d6752c89b93b06e8.scope. Feb 13 09:55:17.494000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:17.494000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:17.494000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:17.494000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:17.494000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:17.494000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:17.494000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:17.494000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:17.494000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:17.494000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:17.494000 audit: BPF prog-id=127 op=LOAD Feb 13 09:55:17.495000 audit[3164]: AVC avc: denied { bpf } for pid=3164 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:17.495000 audit[3164]: SYSCALL arch=c000003e syscall=321 success=yes exit=0 a0=f a1=c000197c48 a2=10 a3=1c items=0 ppid=3143 pid=3164 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:17.495000 
audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3663623736303039356262376361653062633163613337663038633665 Feb 13 09:55:17.495000 audit[3164]: AVC avc: denied { perfmon } for pid=3164 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:17.495000 audit[3164]: SYSCALL arch=c000003e syscall=321 success=yes exit=15 a0=0 a1=c0001976b0 a2=3c a3=c items=0 ppid=3143 pid=3164 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:17.495000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3663623736303039356262376361653062633163613337663038633665 Feb 13 09:55:17.495000 audit[3164]: AVC avc: denied { bpf } for pid=3164 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:17.495000 audit[3164]: AVC avc: denied { bpf } for pid=3164 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:17.495000 audit[3164]: AVC avc: denied { bpf } for pid=3164 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:17.495000 audit[3164]: AVC avc: denied { perfmon } for pid=3164 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:17.495000 audit[3164]: AVC avc: denied { perfmon } for pid=3164 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:17.495000 audit[3164]: AVC avc: denied { perfmon } for pid=3164 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:17.495000 audit[3164]: AVC avc: denied { perfmon } for pid=3164 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:17.495000 audit[3164]: AVC avc: denied { perfmon } for pid=3164 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:17.495000 audit[3164]: AVC avc: denied { bpf } for pid=3164 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:17.495000 audit[3164]: AVC avc: denied { bpf } for pid=3164 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:17.495000 audit: BPF prog-id=128 op=LOAD Feb 13 09:55:17.495000 audit[3164]: SYSCALL arch=c000003e syscall=321 success=yes exit=15 a0=5 a1=c0001979d8 a2=78 a3=c00028c930 items=0 ppid=3143 pid=3164 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:17.495000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3663623736303039356262376361653062633163613337663038633665 Feb 13 09:55:17.495000 audit[3164]: AVC avc: denied { bpf } for pid=3164 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:17.495000 audit[3164]: AVC avc: denied { bpf } for pid=3164 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:17.495000 audit[3164]: AVC avc: denied { perfmon } for pid=3164 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:17.495000 audit[3164]: AVC avc: denied { perfmon } for pid=3164 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:17.495000 audit[3164]: AVC avc: denied { perfmon } for pid=3164 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:17.495000 audit[3164]: AVC avc: denied { perfmon } for pid=3164 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:17.495000 audit[3164]: AVC avc: denied { perfmon } for pid=3164 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:17.495000 audit[3164]: AVC avc: denied { bpf } for pid=3164 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:17.495000 audit[3164]: AVC avc: denied { bpf } for pid=3164 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:17.495000 audit: BPF prog-id=129 op=LOAD Feb 13 09:55:17.495000 audit[3164]: SYSCALL arch=c000003e syscall=321 success=yes exit=17 a0=5 a1=c000197770 a2=78 a3=c00028c978 items=0 ppid=3143 pid=3164 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:17.495000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3663623736303039356262376361653062633163613337663038633665 Feb 13 09:55:17.495000 audit: BPF prog-id=129 op=UNLOAD Feb 13 09:55:17.495000 audit: BPF prog-id=128 op=UNLOAD Feb 13 09:55:17.496000 audit[3164]: AVC avc: denied { bpf } for pid=3164 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:17.496000 audit[3164]: AVC avc: denied { bpf } for pid=3164 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 
permissive=0 Feb 13 09:55:17.496000 audit[3164]: AVC avc: denied { bpf } for pid=3164 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:17.496000 audit[3164]: AVC avc: denied { perfmon } for pid=3164 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:17.496000 audit[3164]: AVC avc: denied { perfmon } for pid=3164 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:17.496000 audit[3164]: AVC avc: denied { perfmon } for pid=3164 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:17.496000 audit[3164]: AVC avc: denied { perfmon } for pid=3164 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:17.496000 audit[3164]: AVC avc: denied { perfmon } for pid=3164 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:17.496000 audit[3164]: AVC avc: denied { bpf } for pid=3164 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:17.496000 audit[3164]: AVC avc: denied { bpf } for pid=3164 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:17.496000 audit: BPF prog-id=130 op=LOAD Feb 13 09:55:17.496000 audit[3164]: SYSCALL arch=c000003e syscall=321 success=yes exit=15 a0=5 a1=c000197c30 a2=78 a3=c00028cd88 items=0 ppid=3143 pid=3164 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:17.496000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3663623736303039356262376361653062633163613337663038633665 Feb 13 09:55:17.534632 env[1473]: time="2024-02-13T09:55:17.534595773Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5cb848556c-k44gx,Uid:8fc26090-e86d-4c25-96ee-b0a7bf8af75f,Namespace:calico-system,Attempt:0,} returns sandbox id \"6cb760095bb7cae0bc1ca37f08c6e651f85b4b39fb1134b7d6752c89b93b06e8\"" Feb 13 09:55:17.535961 env[1473]: time="2024-02-13T09:55:17.535813371Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.27.0\"" Feb 13 09:55:17.562651 kubelet[2593]: E0213 09:55:17.562604 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:17.562651 kubelet[2593]: W0213 09:55:17.562618 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:17.562651 kubelet[2593]: E0213 09:55:17.562633 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 09:55:17.562793 kubelet[2593]: E0213 09:55:17.562752 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:17.562793 kubelet[2593]: W0213 09:55:17.562758 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:17.562793 kubelet[2593]: E0213 09:55:17.562766 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:17.664563 kubelet[2593]: E0213 09:55:17.664502 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:17.664563 kubelet[2593]: W0213 09:55:17.664542 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:17.664938 kubelet[2593]: E0213 09:55:17.664586 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:17.665115 kubelet[2593]: E0213 09:55:17.665094 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:17.665268 kubelet[2593]: W0213 09:55:17.665126 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:17.665268 kubelet[2593]: E0213 09:55:17.665166 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:17.731416 kubelet[2593]: E0213 09:55:17.731366 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:17.731416 kubelet[2593]: W0213 09:55:17.731401 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:17.731762 kubelet[2593]: E0213 09:55:17.731443 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 09:55:17.762814 env[1473]: time="2024-02-13T09:55:17.762597460Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-cfb9g,Uid:b426eab0-a52b-41de-addb-7821036fa0b4,Namespace:calico-system,Attempt:0,}" Feb 13 09:55:17.766515 kubelet[2593]: E0213 09:55:17.766472 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:17.766515 kubelet[2593]: W0213 09:55:17.766508 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:17.766855 kubelet[2593]: E0213 09:55:17.766550 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:17.784949 env[1473]: time="2024-02-13T09:55:17.784799611Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 09:55:17.784949 env[1473]: time="2024-02-13T09:55:17.784886071Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 09:55:17.784949 env[1473]: time="2024-02-13T09:55:17.784922338Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 09:55:17.785537 env[1473]: time="2024-02-13T09:55:17.785335696Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/dd7a8052e34dea3a5364f4620aefaf6e2242ec7c4b6ebf806a10a077d14a22c1 pid=3217 runtime=io.containerd.runc.v2 Feb 13 09:55:17.811128 systemd[1]: Started cri-containerd-dd7a8052e34dea3a5364f4620aefaf6e2242ec7c4b6ebf806a10a077d14a22c1.scope. 
Feb 13 09:55:17.829000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 09:55:17.829000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 09:55:17.829000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 09:55:17.829000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 09:55:17.829000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 09:55:17.829000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 09:55:17.829000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 09:55:17.829000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 09:55:17.829000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 09:55:17.829000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 09:55:17.829000 audit: BPF prog-id=131 op=LOAD
Feb 13 09:55:17.830000 audit[3227]: AVC avc: denied { bpf } for pid=3227 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 09:55:17.830000 audit[3227]: SYSCALL arch=c000003e syscall=321 success=yes exit=0 a0=f a1=c000145c48 a2=10 a3=1c items=0 ppid=3217 pid=3227 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 09:55:17.830000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6464376138303532653334646561336135333634663436323061656661
Feb 13 09:55:17.830000 audit[3227]: AVC avc: denied { perfmon } for pid=3227 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 09:55:17.830000 audit[3227]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=0 a1=c0001456b0 a2=3c a3=c items=0 ppid=3217 pid=3227 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 09:55:17.830000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6464376138303532653334646561336135333634663436323061656661
Feb 13 09:55:17.830000 audit[3227]: AVC avc: denied { bpf } for pid=3227 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 09:55:17.830000 audit[3227]: AVC avc: denied { bpf } for pid=3227 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 09:55:17.830000 audit[3227]: AVC avc: denied { bpf } for pid=3227 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 09:55:17.830000 audit[3227]: AVC avc: denied { perfmon } for pid=3227 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 09:55:17.830000 audit[3227]: AVC avc: denied { perfmon } for pid=3227 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 09:55:17.830000 audit[3227]: AVC avc: denied { perfmon } for pid=3227 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 09:55:17.830000 audit[3227]: AVC avc: denied { perfmon } for pid=3227 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 09:55:17.830000 audit[3227]: AVC avc: denied { perfmon } for pid=3227 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 09:55:17.830000 audit[3227]: AVC avc: denied { bpf } for pid=3227 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 09:55:17.830000 audit[3227]: AVC avc: denied { bpf } for pid=3227 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 09:55:17.830000 audit: BPF prog-id=132 op=LOAD
Feb 13 09:55:17.830000 audit[3227]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=5 a1=c0001459d8 a2=78 a3=c0001d6aa0 items=0 ppid=3217 pid=3227 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 09:55:17.830000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6464376138303532653334646561336135333634663436323061656661
Feb 13 09:55:17.830000 audit[3227]: AVC avc: denied { bpf } for pid=3227 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 09:55:17.830000 audit[3227]: AVC avc: denied { bpf } for pid=3227 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 09:55:17.830000 audit[3227]: AVC avc: denied { perfmon } for pid=3227 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 09:55:17.830000 audit[3227]: AVC avc: denied { perfmon } for pid=3227 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 09:55:17.830000 audit[3227]: AVC avc: denied { perfmon } for pid=3227 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 09:55:17.830000 audit[3227]: AVC avc: denied { perfmon } for pid=3227 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 09:55:17.830000 audit[3227]: AVC avc: denied { perfmon } for pid=3227 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 09:55:17.830000 audit[3227]: AVC avc: denied { bpf } for pid=3227 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 09:55:17.830000 audit[3227]: AVC avc: denied { bpf } for pid=3227 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 09:55:17.830000 audit: BPF prog-id=133 op=LOAD
Feb 13 09:55:17.830000 audit[3227]: SYSCALL arch=c000003e syscall=321 success=yes exit=18 a0=5 a1=c000145770 a2=78 a3=c0001d6ae8 items=0 ppid=3217 pid=3227 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 09:55:17.830000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6464376138303532653334646561336135333634663436323061656661
Feb 13 09:55:17.830000 audit: BPF prog-id=133 op=UNLOAD
Feb 13 09:55:17.830000 audit: BPF prog-id=132 op=UNLOAD
Feb 13 09:55:17.831000 audit[3227]: AVC avc: denied { bpf } for pid=3227 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 09:55:17.831000 audit[3227]: AVC avc: denied { bpf } for pid=3227 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 09:55:17.831000 audit[3227]: AVC avc: denied { bpf } for pid=3227 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 09:55:17.831000 audit[3227]: AVC avc: denied { perfmon } for pid=3227 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 09:55:17.831000 audit[3227]: AVC avc: denied { perfmon } for pid=3227 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 09:55:17.831000 audit[3227]: AVC avc: denied { perfmon } for pid=3227 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 09:55:17.831000 audit[3227]: AVC avc: denied { perfmon } for pid=3227 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 09:55:17.831000 audit[3227]: AVC avc: denied { perfmon } for pid=3227 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 09:55:17.831000 audit[3227]: AVC avc: denied { bpf } for pid=3227 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 09:55:17.831000 audit[3227]: AVC avc: denied { bpf } for pid=3227 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 09:55:17.831000 audit: BPF prog-id=134 op=LOAD
Feb 13 09:55:17.831000 audit[3227]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=5 a1=c000145c30 a2=78 a3=c0001d6ef8 items=0 ppid=3217 pid=3227 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 09:55:17.831000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6464376138303532653334646561336135333634663436323061656661
Feb 13 09:55:17.842936 env[1473]: time="2024-02-13T09:55:17.842874750Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-cfb9g,Uid:b426eab0-a52b-41de-addb-7821036fa0b4,Namespace:calico-system,Attempt:0,} returns sandbox id \"dd7a8052e34dea3a5364f4620aefaf6e2242ec7c4b6ebf806a10a077d14a22c1\""
Feb 13 09:55:17.868149 kubelet[2593]: E0213 09:55:17.868081 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 09:55:17.868149 kubelet[2593]: W0213 09:55:17.868104 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 09:55:17.868149 kubelet[2593]: E0213 09:55:17.868129 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 09:55:17.921548 kubelet[2593]: E0213 09:55:17.921515 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 09:55:17.921548 kubelet[2593]: W0213 09:55:17.921536 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 09:55:17.921752 kubelet[2593]: E0213 09:55:17.921561 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 09:55:18.133000 audit[3278]: NETFILTER_CFG table=filter:107 family=2 entries=14 op=nft_register_rule pid=3278 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Feb 13 09:55:18.133000 audit[3278]: SYSCALL arch=c000003e syscall=46 success=yes exit=4732 a0=3 a1=7ffd9e4ba140 a2=0 a3=7ffd9e4ba12c items=0 ppid=2856 pid=3278 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 09:55:18.133000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273
Feb 13 09:55:18.134000 audit[3278]: NETFILTER_CFG table=nat:108 family=2 entries=20 op=nft_register_rule pid=3278 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Feb 13 09:55:18.134000 audit[3278]: SYSCALL arch=c000003e syscall=46 success=yes exit=5340 a0=3 a1=7ffd9e4ba140 a2=0 a3=7ffd9e4ba12c items=0 ppid=2856 pid=3278 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 09:55:18.134000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273
Feb 13 09:55:18.856976 kubelet[2593]: E0213 09:55:18.856897 2593 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250
Feb 13 09:55:20.782225 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount606814495.mount: Deactivated successfully.
Feb 13 09:55:20.853281 kubelet[2593]: E0213 09:55:20.853221 2593 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250
Feb 13 09:55:22.853793 kubelet[2593]: E0213 09:55:22.853708 2593 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250
Feb 13 09:55:24.853186 kubelet[2593]: E0213 09:55:24.853099 2593 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250
Feb 13 09:55:26.747417 env[1473]: time="2024-02-13T09:55:26.747361015Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/typha:v3.27.0,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 13 09:55:26.747933 env[1473]: time="2024-02-13T09:55:26.747894233Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:b33768e0da1f8a5788a6a5d8ac2dcf15292ea9f3717de450f946c0a055b3532c,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 13 09:55:26.749219 env[1473]: time="2024-02-13T09:55:26.749179173Z" level=info msg="ImageUpdate event &ImageUpdate{Name:ghcr.io/flatcar/calico/typha:v3.27.0,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 13 09:55:26.750081 env[1473]: time="2024-02-13T09:55:26.750033319Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/typha@sha256:5f2d3b8c354a4eb6de46e786889913916e620c6c256982fb8d0f1a1d36a282bc,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 13 09:55:26.750503 env[1473]: time="2024-02-13T09:55:26.750459470Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.27.0\" returns image reference \"sha256:b33768e0da1f8a5788a6a5d8ac2dcf15292ea9f3717de450f946c0a055b3532c\""
Feb 13 09:55:26.750813 env[1473]: time="2024-02-13T09:55:26.750771127Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.27.0\""
Feb 13 09:55:26.753991 env[1473]: time="2024-02-13T09:55:26.753944985Z" level=info msg="CreateContainer within sandbox \"6cb760095bb7cae0bc1ca37f08c6e651f85b4b39fb1134b7d6752c89b93b06e8\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Feb 13 09:55:26.757682 env[1473]: time="2024-02-13T09:55:26.757639811Z" level=info msg="CreateContainer within sandbox \"6cb760095bb7cae0bc1ca37f08c6e651f85b4b39fb1134b7d6752c89b93b06e8\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"2dcc2a061d01461b3eb9c61ceee05970709043ea2125295a9643c681e0598a6f\""
Feb 13 09:55:26.757887 env[1473]: time="2024-02-13T09:55:26.757843288Z" level=info msg="StartContainer for \"2dcc2a061d01461b3eb9c61ceee05970709043ea2125295a9643c681e0598a6f\""
Feb 13 09:55:26.765378 systemd[1]: Started cri-containerd-2dcc2a061d01461b3eb9c61ceee05970709043ea2125295a9643c681e0598a6f.scope.
Feb 13 09:55:26.770000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 09:55:26.799043 kernel: kauditd_printk_skb: 179 callbacks suppressed
Feb 13 09:55:26.799090 kernel: audit: type=1400 audit(1707818126.770:1113): avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 09:55:26.853753 kubelet[2593]: E0213 09:55:26.853714 2593 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250
Feb 13 09:55:26.770000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 09:55:26.926412 kernel: audit: type=1400 audit(1707818126.770:1114): avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 09:55:26.926452 kernel: audit: type=1400 audit(1707818126.770:1115): avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 09:55:26.770000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 09:55:26.990127 kernel: audit: type=1400 audit(1707818126.770:1116): avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 09:55:26.770000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 09:55:27.054075 kernel: audit: type=1400 audit(1707818126.770:1117): avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 09:55:26.770000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 09:55:27.118022 kernel: audit: type=1400 audit(1707818126.770:1118): avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 09:55:26.770000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 09:55:27.182113 kernel: audit: type=1400 audit(1707818126.770:1119): avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 09:55:26.770000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 09:55:27.246246 kernel: audit: type=1400 audit(1707818126.770:1120): avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 09:55:26.770000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 09:55:27.310363 kernel: audit: type=1400 audit(1707818126.770:1121): avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 09:55:26.770000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 09:55:26.861000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 09:55:27.374347 kernel: audit: type=1400 audit(1707818126.861:1122): avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 09:55:26.861000 audit: BPF prog-id=135 op=LOAD
Feb 13 09:55:26.861000 audit[3288]: AVC avc: denied { bpf } for pid=3288 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 09:55:26.861000 audit[3288]: SYSCALL arch=c000003e syscall=321 success=yes exit=0 a0=f a1=c000197c48 a2=10 a3=1c items=0 ppid=3143 pid=3288 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 09:55:26.861000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3264636332613036316430313436316233656239633631636565653035
Feb 13 09:55:26.861000 audit[3288]: AVC avc: denied { perfmon } for pid=3288 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 09:55:26.861000 audit[3288]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=0 a1=c0001976b0 a2=3c a3=8 items=0 ppid=3143 pid=3288 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 09:55:26.861000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3264636332613036316430313436316233656239633631636565653035
Feb 13 09:55:26.861000 audit[3288]: AVC avc: denied { bpf } for pid=3288 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 09:55:26.861000 audit[3288]: AVC avc: denied { bpf } for pid=3288 comm="runc" capability=39
scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:26.861000 audit[3288]: AVC avc: denied { bpf } for pid=3288 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:26.861000 audit[3288]: AVC avc: denied { perfmon } for pid=3288 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:26.861000 audit[3288]: AVC avc: denied { perfmon } for pid=3288 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:26.861000 audit[3288]: AVC avc: denied { perfmon } for pid=3288 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:26.861000 audit[3288]: AVC avc: denied { perfmon } for pid=3288 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:26.861000 audit[3288]: AVC avc: denied { perfmon } for pid=3288 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:26.861000 audit[3288]: AVC avc: denied { bpf } for pid=3288 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:26.861000 audit[3288]: AVC avc: denied { bpf } for pid=3288 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:26.861000 audit: BPF prog-id=136 op=LOAD Feb 13 09:55:26.861000 audit[3288]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=5 a1=c0001979d8 a2=78 a3=c000240750 items=0 ppid=3143 pid=3288 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:26.861000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3264636332613036316430313436316233656239633631636565653035 Feb 13 09:55:26.925000 audit[3288]: AVC avc: denied { bpf } for pid=3288 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:26.925000 audit[3288]: AVC avc: denied { bpf } for pid=3288 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:26.925000 audit[3288]: AVC avc: denied { perfmon } for pid=3288 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:26.925000 audit[3288]: AVC avc: denied { perfmon } for pid=3288 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:26.925000 audit[3288]: AVC avc: denied { perfmon } for pid=3288 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 
tclass=capability2 permissive=0 Feb 13 09:55:26.925000 audit[3288]: AVC avc: denied { perfmon } for pid=3288 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:26.925000 audit[3288]: AVC avc: denied { perfmon } for pid=3288 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:26.925000 audit[3288]: AVC avc: denied { bpf } for pid=3288 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:26.925000 audit[3288]: AVC avc: denied { bpf } for pid=3288 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:26.925000 audit: BPF prog-id=137 op=LOAD Feb 13 09:55:26.925000 audit[3288]: SYSCALL arch=c000003e syscall=321 success=yes exit=18 a0=5 a1=c000197770 a2=78 a3=c000240798 items=0 ppid=3143 pid=3288 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:26.925000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3264636332613036316430313436316233656239633631636565653035 Feb 13 09:55:27.052000 audit: BPF prog-id=137 op=UNLOAD Feb 13 09:55:27.052000 audit: BPF prog-id=136 op=UNLOAD Feb 13 09:55:27.052000 audit[3288]: AVC avc: denied { bpf } for pid=3288 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:27.052000 audit[3288]: AVC avc: denied { bpf } for pid=3288 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:27.052000 audit[3288]: AVC avc: denied { bpf } for pid=3288 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:27.052000 audit[3288]: AVC avc: denied { perfmon } for pid=3288 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:27.052000 audit[3288]: AVC avc: denied { perfmon } for pid=3288 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:27.052000 audit[3288]: AVC avc: denied { perfmon } for pid=3288 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:27.052000 audit[3288]: AVC avc: denied { perfmon } for pid=3288 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:27.052000 audit[3288]: AVC avc: denied { perfmon } for pid=3288 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:27.052000 audit[3288]: AVC avc: denied { bpf } for pid=3288 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 
tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:27.052000 audit[3288]: AVC avc: denied { bpf } for pid=3288 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:55:27.052000 audit: BPF prog-id=138 op=LOAD Feb 13 09:55:27.052000 audit[3288]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=5 a1=c000197c30 a2=78 a3=c000240ba8 items=0 ppid=3143 pid=3288 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:55:27.052000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3264636332613036316430313436316233656239633631636565653035 Feb 13 09:55:27.453255 env[1473]: time="2024-02-13T09:55:27.453225702Z" level=info msg="StartContainer for \"2dcc2a061d01461b3eb9c61ceee05970709043ea2125295a9643c681e0598a6f\" returns successfully" Feb 13 09:55:27.925164 kubelet[2593]: E0213 09:55:27.925076 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:27.925164 kubelet[2593]: W0213 09:55:27.925112 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:27.925164 kubelet[2593]: E0213 09:55:27.925153 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:27.926398 kubelet[2593]: E0213 09:55:27.925697 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:27.926398 kubelet[2593]: W0213 09:55:27.925733 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:27.926398 kubelet[2593]: E0213 09:55:27.925787 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:27.926398 kubelet[2593]: E0213 09:55:27.926243 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:27.926398 kubelet[2593]: W0213 09:55:27.926280 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:27.926398 kubelet[2593]: E0213 09:55:27.926333 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 09:55:27.927205 kubelet[2593]: E0213 09:55:27.926937 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:27.927205 kubelet[2593]: W0213 09:55:27.926966 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:27.927205 kubelet[2593]: E0213 09:55:27.927001 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:27.927560 kubelet[2593]: E0213 09:55:27.927391 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:27.927560 kubelet[2593]: W0213 09:55:27.927421 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:27.927560 kubelet[2593]: E0213 09:55:27.927470 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:27.928024 kubelet[2593]: E0213 09:55:27.927948 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:27.928024 kubelet[2593]: W0213 09:55:27.927982 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:27.928024 kubelet[2593]: E0213 09:55:27.928030 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:27.928688 kubelet[2593]: E0213 09:55:27.928612 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:27.928688 kubelet[2593]: W0213 09:55:27.928646 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:27.928995 kubelet[2593]: E0213 09:55:27.928705 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:27.929203 kubelet[2593]: E0213 09:55:27.929172 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:27.929309 kubelet[2593]: W0213 09:55:27.929207 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:27.929309 kubelet[2593]: E0213 09:55:27.929255 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 09:55:27.929799 kubelet[2593]: E0213 09:55:27.929724 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:27.929799 kubelet[2593]: W0213 09:55:27.929757 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:27.929799 kubelet[2593]: E0213 09:55:27.929803 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:27.930263 kubelet[2593]: E0213 09:55:27.930201 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:27.930263 kubelet[2593]: W0213 09:55:27.930223 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:27.930263 kubelet[2593]: E0213 09:55:27.930251 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:27.930702 kubelet[2593]: E0213 09:55:27.930628 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:27.930702 kubelet[2593]: W0213 09:55:27.930652 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:27.930702 kubelet[2593]: E0213 09:55:27.930679 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:27.931096 kubelet[2593]: E0213 09:55:27.931032 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:27.931096 kubelet[2593]: W0213 09:55:27.931053 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:27.931096 kubelet[2593]: E0213 09:55:27.931079 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 09:55:27.941991 kubelet[2593]: I0213 09:55:27.941913 2593 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="calico-system/calico-typha-5cb848556c-k44gx" podStartSLOduration=-9.22337202591294e+09 pod.CreationTimestamp="2024-02-13 09:55:17 +0000 UTC" firstStartedPulling="2024-02-13 09:55:17.535324678 +0000 UTC m=+18.748844133" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-02-13 09:55:27.940789756 +0000 UTC m=+29.154309290" watchObservedRunningTime="2024-02-13 09:55:27.941835918 +0000 UTC m=+29.155355407" Feb 13 09:55:27.943272 kubelet[2593]: E0213 09:55:27.943218 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:27.943272 kubelet[2593]: W0213 09:55:27.943261 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:27.943662 kubelet[2593]: E0213 09:55:27.943300 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:27.943948 kubelet[2593]: E0213 09:55:27.943912 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:27.943948 kubelet[2593]: W0213 09:55:27.943944 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:27.944194 kubelet[2593]: E0213 09:55:27.943995 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:27.944595 kubelet[2593]: E0213 09:55:27.944559 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:27.944595 kubelet[2593]: W0213 09:55:27.944591 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:27.944877 kubelet[2593]: E0213 09:55:27.944649 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:27.945126 kubelet[2593]: E0213 09:55:27.945098 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:27.945126 kubelet[2593]: W0213 09:55:27.945122 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:27.945396 kubelet[2593]: E0213 09:55:27.945157 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 09:55:27.945635 kubelet[2593]: E0213 09:55:27.945598 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:27.945635 kubelet[2593]: W0213 09:55:27.945630 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:27.945883 kubelet[2593]: E0213 09:55:27.945774 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:27.946055 kubelet[2593]: E0213 09:55:27.946001 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:27.946055 kubelet[2593]: W0213 09:55:27.946028 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:27.946274 kubelet[2593]: E0213 09:55:27.946135 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:27.946619 kubelet[2593]: E0213 09:55:27.946544 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:27.946619 kubelet[2593]: W0213 09:55:27.946575 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:27.946938 kubelet[2593]: E0213 09:55:27.946684 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:27.947124 kubelet[2593]: E0213 09:55:27.947076 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:27.947124 kubelet[2593]: W0213 09:55:27.947102 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:27.947329 kubelet[2593]: E0213 09:55:27.947140 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:27.947744 kubelet[2593]: E0213 09:55:27.947689 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:27.947744 kubelet[2593]: W0213 09:55:27.947721 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:27.948049 kubelet[2593]: E0213 09:55:27.947763 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 09:55:27.948269 kubelet[2593]: E0213 09:55:27.948242 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:27.948403 kubelet[2593]: W0213 09:55:27.948267 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:27.948403 kubelet[2593]: E0213 09:55:27.948354 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:27.948840 kubelet[2593]: E0213 09:55:27.948752 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:27.948840 kubelet[2593]: W0213 09:55:27.948787 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:27.949156 kubelet[2593]: E0213 09:55:27.948938 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:27.949310 kubelet[2593]: E0213 09:55:27.949283 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:27.949482 kubelet[2593]: W0213 09:55:27.949310 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:27.949482 kubelet[2593]: E0213 09:55:27.949408 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:27.949842 kubelet[2593]: E0213 09:55:27.949774 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:27.949842 kubelet[2593]: W0213 09:55:27.949804 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:27.949842 kubelet[2593]: E0213 09:55:27.949845 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:27.950363 kubelet[2593]: E0213 09:55:27.950314 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:27.950363 kubelet[2593]: W0213 09:55:27.950363 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:27.950579 kubelet[2593]: E0213 09:55:27.950404 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 09:55:27.951010 kubelet[2593]: E0213 09:55:27.950953 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:27.951010 kubelet[2593]: W0213 09:55:27.950984 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:27.951248 kubelet[2593]: E0213 09:55:27.951026 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:27.951626 kubelet[2593]: E0213 09:55:27.951557 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:27.951626 kubelet[2593]: W0213 09:55:27.951588 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:27.951626 kubelet[2593]: E0213 09:55:27.951630 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:27.952371 kubelet[2593]: E0213 09:55:27.952318 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:27.952498 kubelet[2593]: W0213 09:55:27.952371 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:27.952498 kubelet[2593]: E0213 09:55:27.952414 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:27.953095 kubelet[2593]: E0213 09:55:27.953041 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:27.953095 kubelet[2593]: W0213 09:55:27.953066 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:27.953095 kubelet[2593]: E0213 09:55:27.953099 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 09:55:28.854306 kubelet[2593]: E0213 09:55:28.854202 2593 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250 Feb 13 09:55:28.924226 kubelet[2593]: I0213 09:55:28.924172 2593 prober_manager.go:287] "Failed to trigger a manual run" probe="Readiness" Feb 13 09:55:28.938621 kubelet[2593]: E0213 09:55:28.938548 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:28.938621 kubelet[2593]: W0213 09:55:28.938577 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:28.938621 kubelet[2593]: E0213 09:55:28.938619 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:28.939576 kubelet[2593]: E0213 09:55:28.939017 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:28.939576 kubelet[2593]: W0213 09:55:28.939038 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:28.939576 kubelet[2593]: E0213 09:55:28.939066 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:28.939576 kubelet[2593]: E0213 09:55:28.939503 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:28.939576 kubelet[2593]: W0213 09:55:28.939525 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:28.939576 kubelet[2593]: E0213 09:55:28.939556 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:28.940207 kubelet[2593]: E0213 09:55:28.940020 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:28.940207 kubelet[2593]: W0213 09:55:28.940043 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:28.940207 kubelet[2593]: E0213 09:55:28.940073 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 09:55:28.940567 kubelet[2593]: E0213 09:55:28.940518 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:28.940567 kubelet[2593]: W0213 09:55:28.940548 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:28.940766 kubelet[2593]: E0213 09:55:28.940584 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:28.941104 kubelet[2593]: E0213 09:55:28.941042 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:28.941104 kubelet[2593]: W0213 09:55:28.941066 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:28.941104 kubelet[2593]: E0213 09:55:28.941099 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:28.941700 kubelet[2593]: E0213 09:55:28.941617 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:28.941700 kubelet[2593]: W0213 09:55:28.941648 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:28.941700 kubelet[2593]: E0213 09:55:28.941683 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:28.942118 kubelet[2593]: E0213 09:55:28.942089 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:28.942118 kubelet[2593]: W0213 09:55:28.942112 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:28.942379 kubelet[2593]: E0213 09:55:28.942146 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:28.942588 kubelet[2593]: E0213 09:55:28.942545 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:28.942588 kubelet[2593]: W0213 09:55:28.942570 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:28.942815 kubelet[2593]: E0213 09:55:28.942601 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 09:55:28.943075 kubelet[2593]: E0213 09:55:28.943030 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:28.943075 kubelet[2593]: W0213 09:55:28.943054 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:28.943315 kubelet[2593]: E0213 09:55:28.943084 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:28.943486 kubelet[2593]: E0213 09:55:28.943462 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:28.943486 kubelet[2593]: W0213 09:55:28.943484 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:28.943709 kubelet[2593]: E0213 09:55:28.943513 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:28.943986 kubelet[2593]: E0213 09:55:28.943917 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:28.943986 kubelet[2593]: W0213 09:55:28.943939 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:28.943986 kubelet[2593]: E0213 09:55:28.943967 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:28.953524 kubelet[2593]: E0213 09:55:28.953454 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:28.953524 kubelet[2593]: W0213 09:55:28.953485 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:28.953524 kubelet[2593]: E0213 09:55:28.953520 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:28.954098 kubelet[2593]: E0213 09:55:28.954023 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:28.954098 kubelet[2593]: W0213 09:55:28.954065 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:28.954439 kubelet[2593]: E0213 09:55:28.954129 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 09:55:28.954749 kubelet[2593]: E0213 09:55:28.954681 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:28.954749 kubelet[2593]: W0213 09:55:28.954716 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:28.954749 kubelet[2593]: E0213 09:55:28.954758 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:28.955278 kubelet[2593]: E0213 09:55:28.955225 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:28.955278 kubelet[2593]: W0213 09:55:28.955257 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:28.955563 kubelet[2593]: E0213 09:55:28.955299 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:28.955859 kubelet[2593]: E0213 09:55:28.955790 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:28.955859 kubelet[2593]: W0213 09:55:28.955822 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:28.956159 kubelet[2593]: E0213 09:55:28.955936 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:28.956362 kubelet[2593]: E0213 09:55:28.956314 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:28.956501 kubelet[2593]: W0213 09:55:28.956380 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:28.956624 kubelet[2593]: E0213 09:55:28.956503 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:28.956868 kubelet[2593]: E0213 09:55:28.956843 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:28.956978 kubelet[2593]: W0213 09:55:28.956868 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:28.957077 kubelet[2593]: E0213 09:55:28.956984 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 09:55:28.957319 kubelet[2593]: E0213 09:55:28.957293 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:28.957319 kubelet[2593]: W0213 09:55:28.957317 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:28.957576 kubelet[2593]: E0213 09:55:28.957378 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:28.957979 kubelet[2593]: E0213 09:55:28.957949 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:28.958101 kubelet[2593]: W0213 09:55:28.957981 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:28.958101 kubelet[2593]: E0213 09:55:28.958023 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:28.958495 kubelet[2593]: E0213 09:55:28.958432 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:28.958495 kubelet[2593]: W0213 09:55:28.958455 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:28.958816 kubelet[2593]: E0213 09:55:28.958569 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:28.958953 kubelet[2593]: E0213 09:55:28.958893 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:28.958953 kubelet[2593]: W0213 09:55:28.958922 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:28.959144 kubelet[2593]: E0213 09:55:28.959031 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:28.959428 kubelet[2593]: E0213 09:55:28.959332 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:28.959428 kubelet[2593]: W0213 09:55:28.959385 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:28.959762 kubelet[2593]: E0213 09:55:28.959493 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 09:55:28.959894 kubelet[2593]: E0213 09:55:28.959828 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:28.959894 kubelet[2593]: W0213 09:55:28.959857 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:28.960083 kubelet[2593]: E0213 09:55:28.959899 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:28.960326 kubelet[2593]: E0213 09:55:28.960299 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:28.960326 kubelet[2593]: W0213 09:55:28.960323 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:28.960665 kubelet[2593]: E0213 09:55:28.960381 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:28.961052 kubelet[2593]: E0213 09:55:28.960984 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:28.961052 kubelet[2593]: W0213 09:55:28.961014 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:28.961052 kubelet[2593]: E0213 09:55:28.961055 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:28.961620 kubelet[2593]: E0213 09:55:28.961544 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:28.961620 kubelet[2593]: W0213 09:55:28.961576 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:28.961620 kubelet[2593]: E0213 09:55:28.961619 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:28.962218 kubelet[2593]: E0213 09:55:28.962161 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:28.962218 kubelet[2593]: W0213 09:55:28.962191 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:28.962491 kubelet[2593]: E0213 09:55:28.962233 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 09:55:28.962769 kubelet[2593]: E0213 09:55:28.962700 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:28.962769 kubelet[2593]: W0213 09:55:28.962731 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:28.962769 kubelet[2593]: E0213 09:55:28.962766 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:30.854080 kubelet[2593]: E0213 09:55:30.853969 2593 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250 Feb 13 09:55:32.854171 kubelet[2593]: E0213 09:55:32.854069 2593 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250 Feb 13 09:55:34.853999 kubelet[2593]: E0213 09:55:34.853892 2593 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250 Feb 13 09:55:35.170385 kubelet[2593]: I0213 09:55:35.170239 2593 prober_manager.go:287] "Failed to trigger a manual run" probe="Readiness" Feb 13 09:55:35.188502 kubelet[2593]: E0213 09:55:35.188459 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:35.188502 kubelet[2593]: W0213 09:55:35.188470 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:35.188502 kubelet[2593]: E0213 09:55:35.188483 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:35.188642 kubelet[2593]: E0213 09:55:35.188622 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:35.188642 kubelet[2593]: W0213 09:55:35.188628 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:35.188642 kubelet[2593]: E0213 09:55:35.188635 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 09:55:35.188788 kubelet[2593]: E0213 09:55:35.188753 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:35.188788 kubelet[2593]: W0213 09:55:35.188759 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:35.188788 kubelet[2593]: E0213 09:55:35.188767 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:35.188921 kubelet[2593]: E0213 09:55:35.188886 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:35.188921 kubelet[2593]: W0213 09:55:35.188892 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:35.188921 kubelet[2593]: E0213 09:55:35.188899 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:35.189040 kubelet[2593]: E0213 09:55:35.189004 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:35.189040 kubelet[2593]: W0213 09:55:35.189010 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:35.189040 kubelet[2593]: E0213 09:55:35.189017 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:35.189107 kubelet[2593]: E0213 09:55:35.189080 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:35.189107 kubelet[2593]: W0213 09:55:35.189084 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:35.189107 kubelet[2593]: E0213 09:55:35.189090 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:35.189180 kubelet[2593]: E0213 09:55:35.189175 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:35.189180 kubelet[2593]: W0213 09:55:35.189179 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:35.189218 kubelet[2593]: E0213 09:55:35.189185 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 09:55:35.189246 kubelet[2593]: E0213 09:55:35.189242 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:35.189268 kubelet[2593]: W0213 09:55:35.189246 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:35.189268 kubelet[2593]: E0213 09:55:35.189252 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:35.189323 kubelet[2593]: E0213 09:55:35.189309 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:35.189323 kubelet[2593]: W0213 09:55:35.189313 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:35.189323 kubelet[2593]: E0213 09:55:35.189319 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:35.189455 kubelet[2593]: E0213 09:55:35.189450 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:35.189455 kubelet[2593]: W0213 09:55:35.189455 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:35.189499 kubelet[2593]: E0213 09:55:35.189460 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:35.189534 kubelet[2593]: E0213 09:55:35.189529 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:35.189534 kubelet[2593]: W0213 09:55:35.189534 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:35.189574 kubelet[2593]: E0213 09:55:35.189539 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:35.189602 kubelet[2593]: E0213 09:55:35.189597 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:35.189623 kubelet[2593]: W0213 09:55:35.189602 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:35.189623 kubelet[2593]: E0213 09:55:35.189607 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 09:55:35.198222 kubelet[2593]: E0213 09:55:35.198208 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:35.198222 kubelet[2593]: W0213 09:55:35.198218 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:35.198343 kubelet[2593]: E0213 09:55:35.198230 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:35.198369 kubelet[2593]: E0213 09:55:35.198356 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:35.198369 kubelet[2593]: W0213 09:55:35.198362 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:35.198412 kubelet[2593]: E0213 09:55:35.198373 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:35.198530 kubelet[2593]: E0213 09:55:35.198486 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:35.198530 kubelet[2593]: W0213 09:55:35.198492 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:35.198530 kubelet[2593]: E0213 09:55:35.198501 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:35.198669 kubelet[2593]: E0213 09:55:35.198622 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:35.198669 kubelet[2593]: W0213 09:55:35.198627 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:35.198669 kubelet[2593]: E0213 09:55:35.198636 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:35.198747 kubelet[2593]: E0213 09:55:35.198716 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:35.198747 kubelet[2593]: W0213 09:55:35.198720 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:35.198747 kubelet[2593]: E0213 09:55:35.198728 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 09:55:35.198801 kubelet[2593]: E0213 09:55:35.198794 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:35.198801 kubelet[2593]: W0213 09:55:35.198799 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:35.198838 kubelet[2593]: E0213 09:55:35.198805 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:35.198952 kubelet[2593]: E0213 09:55:35.198918 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:35.198952 kubelet[2593]: W0213 09:55:35.198922 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:35.198952 kubelet[2593]: E0213 09:55:35.198928 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:35.199105 kubelet[2593]: E0213 09:55:35.199095 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:35.199105 kubelet[2593]: W0213 09:55:35.199102 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:35.199191 kubelet[2593]: E0213 09:55:35.199112 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:35.199191 kubelet[2593]: E0213 09:55:35.199190 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:35.199262 kubelet[2593]: W0213 09:55:35.199196 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:35.199262 kubelet[2593]: E0213 09:55:35.199207 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:35.199318 kubelet[2593]: E0213 09:55:35.199299 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:35.199318 kubelet[2593]: W0213 09:55:35.199305 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:35.199318 kubelet[2593]: E0213 09:55:35.199315 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 09:55:35.199419 kubelet[2593]: E0213 09:55:35.199400 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:35.199419 kubelet[2593]: W0213 09:55:35.199406 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:35.199419 kubelet[2593]: E0213 09:55:35.199418 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:35.199545 kubelet[2593]: E0213 09:55:35.199538 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:35.199545 kubelet[2593]: W0213 09:55:35.199544 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:35.199613 kubelet[2593]: E0213 09:55:35.199554 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:35.199718 kubelet[2593]: E0213 09:55:35.199711 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:35.199745 kubelet[2593]: W0213 09:55:35.199718 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:35.199745 kubelet[2593]: E0213 09:55:35.199726 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:35.199832 kubelet[2593]: E0213 09:55:35.199828 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:35.199856 kubelet[2593]: W0213 09:55:35.199832 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:35.199856 kubelet[2593]: E0213 09:55:35.199839 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:35.199940 kubelet[2593]: E0213 09:55:35.199936 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:35.199940 kubelet[2593]: W0213 09:55:35.199940 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:35.199985 kubelet[2593]: E0213 09:55:35.199946 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 09:55:35.200047 kubelet[2593]: E0213 09:55:35.200042 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:35.200047 kubelet[2593]: W0213 09:55:35.200047 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:35.200087 kubelet[2593]: E0213 09:55:35.200053 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:35.200220 kubelet[2593]: E0213 09:55:35.200176 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:35.200220 kubelet[2593]: W0213 09:55:35.200183 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:35.200220 kubelet[2593]: E0213 09:55:35.200192 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 09:55:35.200293 kubelet[2593]: E0213 09:55:35.200263 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 09:55:35.200293 kubelet[2593]: W0213 09:55:35.200267 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 09:55:35.200293 kubelet[2593]: E0213 09:55:35.200276 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
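The two errors in each FlexVolume triple are mechanically linked: the kubelet's plugin prober execs the driver binary and then JSON-decodes whatever the call produced, so while /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds is absent the exec fails with empty output, and decoding an empty string is exactly what produces "unexpected end of JSON input". A minimal Go sketch of that failure mode, where driverStatus is a simplified stand-in rather than the kubelet's real response type:

```go
package main

import (
	"encoding/json"
	"fmt"
	"os/exec"
)

// driverStatus is a simplified stand-in for the kubelet's FlexVolume
// response type (the real struct lives in the kubelet's flexvolume code).
type driverStatus struct {
	Status  string `json:"status"`
	Message string `json:"message,omitempty"`
}

func callDriver(driver string, args ...string) (*driverStatus, error) {
	// When the driver binary is missing, the exec fails and out stays empty.
	out, execErr := exec.Command(driver, args...).CombinedOutput()
	if execErr != nil {
		fmt.Printf("driver call failed: executable: %s, args: %v, error: %v, output: %q\n",
			driver, args, execErr, string(out))
	}
	// json.Unmarshal on empty input always returns
	// "unexpected end of JSON input", the second error logged per probe.
	var status driverStatus
	if err := json.Unmarshal(out, &status); err != nil {
		return nil, fmt.Errorf("failed to unmarshal output for command: %s, output: %q, error: %v",
			args[0], string(out), err)
	}
	return &status, nil
}

func main() {
	_, err := callDriver("/opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds", "init")
	fmt.Println(err)
}
```

The noise stops once something installs the driver into the kubelet's plugin directory, which appears to be what the calico pod2daemon-flexvol container pulled later in this log is for.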
Feb 13 09:55:35.206000 audit[3445]: NETFILTER_CFG table=filter:109 family=2 entries=13 op=nft_register_rule pid=3445 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Feb 13 09:55:35.234199 kernel: kauditd_printk_skb: 47 callbacks suppressed
Feb 13 09:55:35.234245 kernel: audit: type=1325 audit(1707818135.206:1131): table=filter:109 family=2 entries=13 op=nft_register_rule pid=3445 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Feb 13 09:55:35.206000 audit[3445]: SYSCALL arch=c000003e syscall=46 success=yes exit=4028 a0=3 a1=7ffe8aa5b080 a2=0 a3=7ffe8aa5b06c items=0 ppid=2856 pid=3445 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 09:55:35.295407 kernel: audit: type=1300 audit(1707818135.206:1131): arch=c000003e syscall=46 success=yes exit=4028 a0=3 a1=7ffe8aa5b080 a2=0 a3=7ffe8aa5b06c items=0 ppid=2856 pid=3445 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 09:55:35.206000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273
Feb 13 09:55:35.451936 kernel: audit: type=1327 audit(1707818135.206:1131): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273
Feb 13 09:55:35.451968 kernel: audit: type=1325 audit(1707818135.206:1132): table=nat:110 family=2 entries=27 op=nft_register_chain pid=3445 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Feb 13 09:55:35.206000 audit[3445]: NETFILTER_CFG table=nat:110 family=2 entries=27 op=nft_register_chain pid=3445 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Feb 13 09:55:35.511826 kernel: audit: type=1300 audit(1707818135.206:1132): arch=c000003e syscall=46 success=yes exit=8836 a0=3 a1=7ffe8aa5b080 a2=0 a3=7ffe8aa5b06c items=0 ppid=2856 pid=3445 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 09:55:35.206000 audit[3445]: SYSCALL arch=c000003e syscall=46 success=yes exit=8836 a0=3 a1=7ffe8aa5b080 a2=0 a3=7ffe8aa5b06c items=0 ppid=2856 pid=3445 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 09:55:35.610604 kernel: audit: type=1327 audit(1707818135.206:1132): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273
Feb 13 09:55:35.206000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273
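The PROCTITLE field in an audit record is the audited process's argv, hex-encoded with NUL separators, so these records can be decoded offline to see exactly what ran (syscall=46 on x86_64 is sendmsg, the netlink traffic that nft rule registration generates). A small self-contained Go decoder, with the proctitle value copied from the records above; the caller is presumably kube-proxy or a CNI component driving iptables-restore:

```go
package main

import (
	"encoding/hex"
	"fmt"
	"strings"
)

func main() {
	// proctitle value from the audit records above.
	const proctitle = "69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273"
	raw, err := hex.DecodeString(proctitle)
	if err != nil {
		panic(err)
	}
	// argv entries are separated by NUL bytes.
	argv := strings.Split(string(raw), "\x00")
	fmt.Println(strings.Join(argv, " "))
	// Output: iptables-restore -w 5 -W 100000 --noflush --counters
}
```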
Feb 13 09:55:35.995475 kubelet[2593]: E0213 09:55:35.995422 2593 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 09:55:35.995475 kubelet[2593]: W0213 09:55:35.995463 2593 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 09:55:35.996414 kubelet[2593]: E0213 09:55:35.995511 2593 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
[the same three-line FlexVolume probe failure repeats, timestamps only advancing, about thirty more times between 09:55:35.995967 and 09:55:36.013826]
Feb 13 09:55:36.853376 kubelet[2593]: E0213 09:55:36.853236 2593 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250
[the same pod sync error recurs every two seconds; the occurrences from 09:55:38 through 09:55:48 are omitted]
Feb 13 09:55:50.854033 kubelet[2593]: E0213 09:55:50.853941 2593 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250
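Every pod_workers.go entry above is the same gate: the kubelet refuses to sync csi-node-driver-w8xgk while the CRI runtime reports its NetworkReady condition as false, and that stays false until a CNI config is installed (the Calico images pulled below are what eventually provide one). A rough Go illustration of how such a condition becomes the logged error string; runtimeCondition and networkReadyError are illustrative names, not the kubelet's actual types:

```go
package main

import "fmt"

// runtimeCondition is an illustrative stand-in for the CRI runtime
// status condition the kubelet inspects before syncing pods.
type runtimeCondition struct {
	Type    string
	Status  bool
	Reason  string
	Message string
}

func networkReadyError(c runtimeCondition) error {
	if c.Type == "NetworkReady" && !c.Status {
		return fmt.Errorf(
			"network is not ready: container runtime network not ready: NetworkReady=%v reason:%s message:%s",
			c.Status, c.Reason, c.Message)
	}
	return nil
}

func main() {
	c := runtimeCondition{
		Type:    "NetworkReady",
		Status:  false,
		Reason:  "NetworkPluginNotReady",
		Message: "Network plugin returns error: cni plugin not initialized",
	}
	fmt.Println(networkReadyError(c)) // matches the kubelet errors above
}
```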
&ImageCreate{Name:sha256:6506d2e0be2d5ec9cb8dbe00c4b4f037c67b6ab4ec14a1f0c83333ac51f4da9a,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 09:55:51.880046 env[1473]: time="2024-02-13T09:55:51.880005221Z" level=info msg="ImageUpdate event &ImageUpdate{Name:ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.27.0,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 09:55:51.881034 env[1473]: time="2024-02-13T09:55:51.880993940Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:b05edbd1f80db4ada229e6001a666a7dd36bb6ab617143684fb3d28abfc4b71e,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 09:55:51.881566 env[1473]: time="2024-02-13T09:55:51.881521875Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.27.0\" returns image reference \"sha256:6506d2e0be2d5ec9cb8dbe00c4b4f037c67b6ab4ec14a1f0c83333ac51f4da9a\"" Feb 13 09:55:51.882471 env[1473]: time="2024-02-13T09:55:51.882455784Z" level=info msg="CreateContainer within sandbox \"dd7a8052e34dea3a5364f4620aefaf6e2242ec7c4b6ebf806a10a077d14a22c1\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Feb 13 09:55:51.886913 env[1473]: time="2024-02-13T09:55:51.886868665Z" level=info msg="CreateContainer within sandbox \"dd7a8052e34dea3a5364f4620aefaf6e2242ec7c4b6ebf806a10a077d14a22c1\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"eaeb287c92e976c9cef11e687e236c6c67557a630d580a98c29677c19b51bf32\"" Feb 13 09:55:51.887100 env[1473]: time="2024-02-13T09:55:51.887057525Z" level=info msg="StartContainer for \"eaeb287c92e976c9cef11e687e236c6c67557a630d580a98c29677c19b51bf32\"" Feb 13 09:55:51.896122 systemd[1]: Started cri-containerd-eaeb287c92e976c9cef11e687e236c6c67557a630d580a98c29677c19b51bf32.scope. 
Feb 13 09:55:51.902000 audit[3488]: AVC avc: denied { perfmon } for pid=3488 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 09:55:51.902000 audit[3488]: AVC avc: denied { bpf } for pid=3488 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
[dozens of further identical { perfmon } and { bpf } denials for pid=3488, their kernel-log echoes (audit: type=1400/1300/1327 ...), and the SYSCALL/PROCTITLE records paired with each BPF program load are omitted; one of each is kept below]
Feb 13 09:55:51.902000 audit[3488]: SYSCALL arch=c000003e syscall=321 success=yes exit=15 a0=0 a1=c0001176b0 a2=3c a3=7fc9845be5b8 items=0 ppid=3217 pid=3488 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 09:55:51.902000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6561656232383763393265393736633963656631316536383765323336
Feb 13 09:55:51.902000 audit: BPF prog-id=139 op=LOAD
Feb 13 09:55:51.967000 audit: BPF prog-id=140 op=LOAD
Feb 13 09:55:52.160000 audit: BPF prog-id=140 op=UNLOAD
Feb 13 09:55:52.160000 audit: BPF prog-id=139 op=UNLOAD
Feb 13 09:55:52.160000 audit: BPF prog-id=141 op=LOAD
Feb 13 09:55:52.295538 env[1473]: time="2024-02-13T09:55:52.295508784Z" level=info msg="StartContainer for \"eaeb287c92e976c9cef11e687e236c6c67557a630d580a98c29677c19b51bf32\" returns successfully"
Feb 13 09:55:52.354747 systemd[1]: cri-containerd-eaeb287c92e976c9cef11e687e236c6c67557a630d580a98c29677c19b51bf32.scope: Deactivated successfully.
Feb 13 09:55:52.364770 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-eaeb287c92e976c9cef11e687e236c6c67557a630d580a98c29677c19b51bf32-rootfs.mount: Deactivated successfully.
Feb 13 09:55:52.615000 audit: BPF prog-id=141 op=UNLOAD
Feb 13 09:55:52.645816 env[1473]: time="2024-02-13T09:55:52.645791942Z" level=info msg="shim disconnected" id=eaeb287c92e976c9cef11e687e236c6c67557a630d580a98c29677c19b51bf32
Feb 13 09:55:52.645884 env[1473]: time="2024-02-13T09:55:52.645817571Z" level=warning msg="cleaning up after shim disconnected" id=eaeb287c92e976c9cef11e687e236c6c67557a630d580a98c29677c19b51bf32 namespace=k8s.io
Feb 13 09:55:52.645884 env[1473]: time="2024-02-13T09:55:52.645824128Z" level=info msg="cleaning up dead shim"
Feb 13 09:55:52.649387 env[1473]: time="2024-02-13T09:55:52.649364965Z" level=warning msg="cleanup warnings time=\"2024-02-13T09:55:52Z\" level=info msg=\"starting signal loop\" namespace=k8s.io pid=3525 runtime=io.containerd.runc.v2\n"
Feb 13 09:55:52.853641 kubelet[2593]: E0213 09:55:52.853440 2593 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250
Feb 13 09:55:52.987878 env[1473]: time="2024-02-13T09:55:52.987788361Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.27.0\""
Feb 13 09:55:54.853749 kubelet[2593]: E0213 09:55:54.853681 2593 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250
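capability=38 and capability=39 in the AVC records are CAP_PERFMON and CAP_BPF. runc on a cgroup v2 host installs the pod's device-access policy as a small eBPF program, and each program load and teardown is what audit reports as the BPF prog-id=N op=LOAD/UNLOAD pairs. Note the matching SYSCALL records say success=yes exit=15 (a new program fd): the kernel also accepts CAP_SYS_ADMIN for these checks, so the SELinux denials here are noise rather than failures. A toy reproduction of the load/unload lifecycle with the cilium/ebpf library, illustrative rather than runc's internal code, and needing root or CAP_BPF to run:

```go
package main

import (
	"log"

	"github.com/cilium/ebpf"
	"github.com/cilium/ebpf/asm"
)

func main() {
	// A trivial "return 0" program; loading it triggers an audit
	// "BPF prog-id=N op=LOAD" event, closing it the matching "op=UNLOAD".
	prog, err := ebpf.NewProgram(&ebpf.ProgramSpec{
		Type: ebpf.SocketFilter,
		Instructions: asm.Instructions{
			asm.Mov.Imm(asm.R0, 0),
			asm.Return(),
		},
		License: "MIT",
	})
	if err != nil {
		log.Fatal(err)
	}
	defer prog.Close() // emits the UNLOAD event
	log.Printf("loaded BPF program fd=%d", prog.FD())
}
```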
podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250 Feb 13 09:55:55.202000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:55:55.202000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c0014a7d60 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 09:55:55.202000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 09:55:55.202000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:55:55.202000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=c a1=c000f623f0 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 09:55:55.202000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 09:55:55.433000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:55:55.433000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=65 a1=c0079c95f0 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 09:55:55.433000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 09:55:55.433000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:55:55.433000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=65 a1=c00341e5e0 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 
key=(null) Feb 13 09:55:55.433000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 09:55:55.434000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-client.crt" dev="sdb9" ino=525079 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:55:55.434000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=65 a1=c0099f0480 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 09:55:55.434000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 09:55:55.434000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:55:55.434000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=65 a1=c0099f04b0 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 09:55:55.434000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 09:55:55.435000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/apiserver.crt" dev="sdb9" ino=525073 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:55:55.435000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=65 a1=c00aafb260 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 09:55:55.435000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:55:55.435000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 09:55:55.435000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=66 
a1=c002012520 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 09:55:55.435000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 09:55:56.061192 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2166230995.mount: Deactivated successfully. Feb 13 09:55:56.853659 kubelet[2593]: E0213 09:55:56.853607 2593 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250 Feb 13 09:55:58.853254 kubelet[2593]: E0213 09:55:58.853206 2593 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250 Feb 13 09:56:00.854180 kubelet[2593]: E0213 09:56:00.854081 2593 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250 Feb 13 09:56:02.853324 kubelet[2593]: E0213 09:56:02.853260 2593 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250 Feb 13 09:56:04.853221 kubelet[2593]: E0213 09:56:04.853167 2593 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250 Feb 13 09:56:06.853390 kubelet[2593]: E0213 09:56:06.853288 2593 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250 Feb 13 09:56:08.853823 kubelet[2593]: E0213 09:56:08.853779 2593 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250 Feb 13 09:56:09.103172 env[1473]: time="2024-02-13T09:56:09.103116120Z" level=info msg="ImageCreate event 
&ImageCreate{Name:ghcr.io/flatcar/calico/cni:v3.27.0,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 09:56:09.103779 env[1473]: time="2024-02-13T09:56:09.103724489Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:8e8d96a874c0e2f137bc6e0ff4b9da4ac2341852e41d99ab81983d329bb87d93,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 09:56:09.104843 env[1473]: time="2024-02-13T09:56:09.104790127Z" level=info msg="ImageUpdate event &ImageUpdate{Name:ghcr.io/flatcar/calico/cni:v3.27.0,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 09:56:09.105817 env[1473]: time="2024-02-13T09:56:09.105803471Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/cni@sha256:d943b4c23e82a39b0186a1a3b2fe8f728e543d503df72d7be521501a82b7e7b4,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 09:56:09.106686 env[1473]: time="2024-02-13T09:56:09.106641096Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.27.0\" returns image reference \"sha256:8e8d96a874c0e2f137bc6e0ff4b9da4ac2341852e41d99ab81983d329bb87d93\"" Feb 13 09:56:09.127780 env[1473]: time="2024-02-13T09:56:09.127736667Z" level=info msg="CreateContainer within sandbox \"dd7a8052e34dea3a5364f4620aefaf6e2242ec7c4b6ebf806a10a077d14a22c1\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Feb 13 09:56:09.135182 env[1473]: time="2024-02-13T09:56:09.135161774Z" level=info msg="CreateContainer within sandbox \"dd7a8052e34dea3a5364f4620aefaf6e2242ec7c4b6ebf806a10a077d14a22c1\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"1225b4e3f7146442642e2312824c0284d8713116e8bf16d7f13b8a91f8b04c8b\"" Feb 13 09:56:09.135396 env[1473]: time="2024-02-13T09:56:09.135384759Z" level=info msg="StartContainer for \"1225b4e3f7146442642e2312824c0284d8713116e8bf16d7f13b8a91f8b04c8b\"" Feb 13 09:56:09.146292 systemd[1]: Started cri-containerd-1225b4e3f7146442642e2312824c0284d8713116e8bf16d7f13b8a91f8b04c8b.scope. 
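[Editor's note] The audit records that follow log runc's command line as a PROCTITLE field: the process argv, hex-encoded, with NUL bytes separating the arguments. A minimal Python sketch for decoding such a field; decode_proctitle is an illustrative name, not part of any tool appearing in this log, and the sample hex is just the first few bytes of the runc records nearby:

def decode_proctitle(hex_str: str) -> str:
    # PROCTITLE is the raw argv of the audited process: hex-encoded
    # bytes with NUL (0x00) separators between the arguments.
    raw = bytes.fromhex(hex_str)
    return raw.decode("utf-8", errors="replace").replace("\x00", " ")

# First bytes of the runc PROCTITLE records in this section:
print(decode_proctitle("72756E63002D2D726F6F74"))  # -> "runc --root"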
Feb 13 09:56:09.151000 audit[3551]: AVC avc: denied { perfmon } for pid=3551 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:56:09.180715 kernel: kauditd_printk_skb: 58 callbacks suppressed Feb 13 09:56:09.180758 kernel: audit: type=1400 audit(1707818169.151:1148): avc: denied { perfmon } for pid=3551 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:56:09.151000 audit[3551]: SYSCALL arch=c000003e syscall=321 success=yes exit=15 a0=0 a1=c0001bd6b0 a2=3c a3=7f07a4d86858 items=0 ppid=3217 pid=3551 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:56:09.344697 kernel: audit: type=1300 audit(1707818169.151:1148): arch=c000003e syscall=321 success=yes exit=15 a0=0 a1=c0001bd6b0 a2=3c a3=7f07a4d86858 items=0 ppid=3217 pid=3551 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:56:09.344751 kernel: audit: type=1327 audit(1707818169.151:1148): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3132323562346533663731343634343236343265323331323832346330 Feb 13 09:56:09.151000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3132323562346533663731343634343236343265323331323832346330 Feb 13 09:56:09.439238 kernel: audit: type=1400 audit(1707818169.151:1149): avc: denied { bpf } for pid=3551 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:56:09.151000 audit[3551]: AVC avc: denied { bpf } for pid=3551 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:56:09.151000 audit[3551]: AVC avc: denied { bpf } for pid=3551 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:56:09.567649 kernel: audit: type=1400 audit(1707818169.151:1149): avc: denied { bpf } for pid=3551 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:56:09.567684 kernel: audit: type=1400 audit(1707818169.151:1149): avc: denied { bpf } for pid=3551 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:56:09.151000 audit[3551]: AVC avc: denied { bpf } for pid=3551 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:56:09.632058 kernel: audit: type=1400 audit(1707818169.151:1149): avc: denied { perfmon } for pid=3551 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 
09:56:09.151000 audit[3551]: AVC avc: denied { perfmon } for pid=3551 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:56:09.151000 audit[3551]: AVC avc: denied { perfmon } for pid=3551 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:56:09.707822 env[1473]: time="2024-02-13T09:56:09.707762290Z" level=info msg="StartContainer for \"1225b4e3f7146442642e2312824c0284d8713116e8bf16d7f13b8a91f8b04c8b\" returns successfully" Feb 13 09:56:09.761533 kernel: audit: type=1400 audit(1707818169.151:1149): avc: denied { perfmon } for pid=3551 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:56:09.761597 kernel: audit: type=1400 audit(1707818169.151:1149): avc: denied { perfmon } for pid=3551 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:56:09.151000 audit[3551]: AVC avc: denied { perfmon } for pid=3551 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:56:09.826285 kernel: audit: type=1400 audit(1707818169.151:1149): avc: denied { perfmon } for pid=3551 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:56:09.151000 audit[3551]: AVC avc: denied { perfmon } for pid=3551 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:56:09.151000 audit[3551]: AVC avc: denied { perfmon } for pid=3551 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:56:09.151000 audit[3551]: AVC avc: denied { bpf } for pid=3551 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:56:09.151000 audit[3551]: AVC avc: denied { bpf } for pid=3551 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:56:09.151000 audit: BPF prog-id=142 op=LOAD Feb 13 09:56:09.151000 audit[3551]: SYSCALL arch=c000003e syscall=321 success=yes exit=15 a0=5 a1=c0001bd9d8 a2=78 a3=c0003ce418 items=0 ppid=3217 pid=3551 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:56:09.151000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3132323562346533663731343634343236343265323331323832346330 Feb 13 09:56:09.343000 audit[3551]: AVC avc: denied { bpf } for pid=3551 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:56:09.343000 audit[3551]: AVC avc: denied { bpf } for pid=3551 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 
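[Editor's note] In the AVC records above, capability=38 is CAP_PERFMON and capability=39 is CAP_BPF (both introduced in Linux 5.8), and syscall=321 on arch=c000003e (x86_64) is bpf(2); runc is loading eBPF programs here, which the paired "BPF prog-id=N op=LOAD/UNLOAD" events also record. A small sketch mapping the numeric fields in such a record back to names; the regex and function name are illustrative:

import re

# Capability numbers per include/uapi/linux/capability.h (Linux >= 5.8).
CAP_NAMES = {38: "CAP_PERFMON", 39: "CAP_BPF"}

def caps_in(avc_record):
    # Return the symbolic names of every capability=N field in the record.
    return [CAP_NAMES.get(int(n), f"capability={n}")
            for n in re.findall(r"capability=(\d+)", avc_record)]

print(caps_in('avc: denied { bpf } for pid=3551 comm="runc" capability=39'))
# -> ['CAP_BPF']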
Feb 13 09:56:09.343000 audit[3551]: AVC avc: denied { perfmon } for pid=3551 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:56:09.343000 audit[3551]: AVC avc: denied { perfmon } for pid=3551 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:56:09.343000 audit[3551]: AVC avc: denied { perfmon } for pid=3551 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:56:09.343000 audit[3551]: AVC avc: denied { perfmon } for pid=3551 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:56:09.343000 audit[3551]: AVC avc: denied { perfmon } for pid=3551 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:56:09.343000 audit[3551]: AVC avc: denied { bpf } for pid=3551 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:56:09.343000 audit[3551]: AVC avc: denied { bpf } for pid=3551 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:56:09.343000 audit: BPF prog-id=143 op=LOAD Feb 13 09:56:09.343000 audit[3551]: SYSCALL arch=c000003e syscall=321 success=yes exit=17 a0=5 a1=c0001bd770 a2=78 a3=c0003ce468 items=0 ppid=3217 pid=3551 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:56:09.343000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3132323562346533663731343634343236343265323331323832346330 Feb 13 09:56:09.501000 audit: BPF prog-id=143 op=UNLOAD Feb 13 09:56:09.501000 audit: BPF prog-id=142 op=UNLOAD Feb 13 09:56:09.501000 audit[3551]: AVC avc: denied { bpf } for pid=3551 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:56:09.501000 audit[3551]: AVC avc: denied { bpf } for pid=3551 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:56:09.501000 audit[3551]: AVC avc: denied { bpf } for pid=3551 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:56:09.501000 audit[3551]: AVC avc: denied { perfmon } for pid=3551 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:56:09.501000 audit[3551]: AVC avc: denied { perfmon } for pid=3551 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:56:09.501000 audit[3551]: AVC avc: denied { perfmon } for pid=3551 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 
tclass=capability2 permissive=0 Feb 13 09:56:09.501000 audit[3551]: AVC avc: denied { perfmon } for pid=3551 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:56:09.501000 audit[3551]: AVC avc: denied { perfmon } for pid=3551 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:56:09.501000 audit[3551]: AVC avc: denied { bpf } for pid=3551 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:56:09.609000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:56:09.609000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c001989040 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 09:56:09.609000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 09:56:09.611000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:56:09.611000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c001989060 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 09:56:09.611000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 09:56:09.613000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:56:09.613000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c0006f4580 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 09:56:09.613000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 09:56:09.614000 
audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:56:09.614000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c00133df60 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 09:56:09.614000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 09:56:09.501000 audit[3551]: AVC avc: denied { bpf } for pid=3551 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 09:56:09.501000 audit: BPF prog-id=144 op=LOAD Feb 13 09:56:09.501000 audit[3551]: SYSCALL arch=c000003e syscall=321 success=yes exit=15 a0=5 a1=c0001bdc30 a2=78 a3=c0003ce4f8 items=0 ppid=3217 pid=3551 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 09:56:09.501000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3132323562346533663731343634343236343265323331323832346330 Feb 13 09:56:10.314960 env[1473]: time="2024-02-13T09:56:10.314890751Z" level=error msg="failed to reload cni configuration after receiving fs change event(\"/etc/cni/net.d/calico-kubeconfig\": WRITE)" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Feb 13 09:56:10.316434 systemd[1]: cri-containerd-1225b4e3f7146442642e2312824c0284d8713116e8bf16d7f13b8a91f8b04c8b.scope: Deactivated successfully. Feb 13 09:56:10.328413 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-1225b4e3f7146442642e2312824c0284d8713116e8bf16d7f13b8a91f8b04c8b-rootfs.mount: Deactivated successfully. Feb 13 09:56:10.328000 audit: BPF prog-id=144 op=UNLOAD Feb 13 09:56:10.383662 kubelet[2593]: I0213 09:56:10.380413 2593 kubelet_node_status.go:493] "Fast updating node status as it just became ready" Feb 13 09:56:10.418947 kubelet[2593]: I0213 09:56:10.418881 2593 topology_manager.go:210] "Topology Admit Handler" Feb 13 09:56:10.419799 kubelet[2593]: I0213 09:56:10.419742 2593 topology_manager.go:210] "Topology Admit Handler" Feb 13 09:56:10.420460 kubelet[2593]: I0213 09:56:10.420411 2593 topology_manager.go:210] "Topology Admit Handler" Feb 13 09:56:10.432727 systemd[1]: Created slice kubepods-burstable-podfe6819ac_25fb_455a_b6b5_7432acf1219d.slice. Feb 13 09:56:10.445095 systemd[1]: Created slice kubepods-burstable-podac15c9fc_cc5d_4a8f_ac09_16f6497ee733.slice. Feb 13 09:56:10.454890 systemd[1]: Created slice kubepods-besteffort-pod18384425_4aba_475c_a64f_6bfe3101b275.slice. 
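[Editor's note] The three "Created slice" entries above mirror how kubelet names pod cgroups under the systemd driver: the QoS class plus the pod UID with dashes mapped to underscores, since '-' is the hierarchy separator in systemd unit names. A sketch reproducing the exact slice name seen in the log; the helper is illustrative, not kubelet code:

def pod_slice_name(qos_class: str, pod_uid: str) -> str:
    # '-' separates hierarchy levels in systemd slice names, so kubelet
    # replaces the dashes inside the pod UID with underscores.
    return f"kubepods-{qos_class}-pod{pod_uid.replace('-', '_')}.slice"

print(pod_slice_name("besteffort", "18384425-4aba-475c-a64f-6bfe3101b275"))
# -> kubepods-besteffort-pod18384425_4aba_475c_a64f_6bfe3101b275.slice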
Feb 13 09:56:10.555591 kubelet[2593]: I0213 09:56:10.555486 2593 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/18384425-4aba-475c-a64f-6bfe3101b275-tigera-ca-bundle\") pod \"calico-kube-controllers-86cd8c4979-2tlsw\" (UID: \"18384425-4aba-475c-a64f-6bfe3101b275\") " pod="calico-system/calico-kube-controllers-86cd8c4979-2tlsw" Feb 13 09:56:10.555591 kubelet[2593]: I0213 09:56:10.555594 2593 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjk9n\" (UniqueName: \"kubernetes.io/projected/18384425-4aba-475c-a64f-6bfe3101b275-kube-api-access-hjk9n\") pod \"calico-kube-controllers-86cd8c4979-2tlsw\" (UID: \"18384425-4aba-475c-a64f-6bfe3101b275\") " pod="calico-system/calico-kube-controllers-86cd8c4979-2tlsw" Feb 13 09:56:10.556145 kubelet[2593]: I0213 09:56:10.556006 2593 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fe6819ac-25fb-455a-b6b5-7432acf1219d-config-volume\") pod \"coredns-787d4945fb-sv24x\" (UID: \"fe6819ac-25fb-455a-b6b5-7432acf1219d\") " pod="kube-system/coredns-787d4945fb-sv24x" Feb 13 09:56:10.556145 kubelet[2593]: I0213 09:56:10.556081 2593 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ac15c9fc-cc5d-4a8f-ac09-16f6497ee733-config-volume\") pod \"coredns-787d4945fb-zxn6w\" (UID: \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\") " pod="kube-system/coredns-787d4945fb-zxn6w" Feb 13 09:56:10.556392 kubelet[2593]: I0213 09:56:10.556168 2593 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q55zk\" (UniqueName: \"kubernetes.io/projected/ac15c9fc-cc5d-4a8f-ac09-16f6497ee733-kube-api-access-q55zk\") pod \"coredns-787d4945fb-zxn6w\" (UID: \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\") " pod="kube-system/coredns-787d4945fb-zxn6w" Feb 13 09:56:10.556392 kubelet[2593]: I0213 09:56:10.556279 2593 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rl2rt\" (UniqueName: \"kubernetes.io/projected/fe6819ac-25fb-455a-b6b5-7432acf1219d-kube-api-access-rl2rt\") pod \"coredns-787d4945fb-sv24x\" (UID: \"fe6819ac-25fb-455a-b6b5-7432acf1219d\") " pod="kube-system/coredns-787d4945fb-sv24x" Feb 13 09:56:10.739076 env[1473]: time="2024-02-13T09:56:10.738961000Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-787d4945fb-sv24x,Uid:fe6819ac-25fb-455a-b6b5-7432acf1219d,Namespace:kube-system,Attempt:0,}" Feb 13 09:56:10.778797 env[1473]: time="2024-02-13T09:56:10.778714118Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-86cd8c4979-2tlsw,Uid:18384425-4aba-475c-a64f-6bfe3101b275,Namespace:calico-system,Attempt:0,}" Feb 13 09:56:10.779119 env[1473]: time="2024-02-13T09:56:10.778759924Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-787d4945fb-zxn6w,Uid:ac15c9fc-cc5d-4a8f-ac09-16f6497ee733,Namespace:kube-system,Attempt:0,}" Feb 13 09:56:10.815319 env[1473]: time="2024-02-13T09:56:10.815182571Z" level=info msg="shim disconnected" id=1225b4e3f7146442642e2312824c0284d8713116e8bf16d7f13b8a91f8b04c8b Feb 13 09:56:10.815319 env[1473]: time="2024-02-13T09:56:10.815283365Z" level=warning msg="cleaning up after shim disconnected" 
id=1225b4e3f7146442642e2312824c0284d8713116e8bf16d7f13b8a91f8b04c8b namespace=k8s.io Feb 13 09:56:10.815319 env[1473]: time="2024-02-13T09:56:10.815315880Z" level=info msg="cleaning up dead shim" Feb 13 09:56:10.833455 env[1473]: time="2024-02-13T09:56:10.833310238Z" level=warning msg="cleanup warnings time=\"2024-02-13T09:56:10Z\" level=info msg=\"starting signal loop\" namespace=k8s.io pid=3616 runtime=io.containerd.runc.v2\n" Feb 13 09:56:10.859212 systemd[1]: Created slice kubepods-besteffort-pod70a6a2a2_80be_4700_bde4_cdae2bf45250.slice. Feb 13 09:56:10.861518 env[1473]: time="2024-02-13T09:56:10.861460386Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-w8xgk,Uid:70a6a2a2-80be-4700-bde4-cdae2bf45250,Namespace:calico-system,Attempt:0,}" Feb 13 09:56:10.882732 env[1473]: time="2024-02-13T09:56:10.882674999Z" level=error msg="Failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:56:10.882875 env[1473]: time="2024-02-13T09:56:10.882845595Z" level=error msg="Failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:56:10.882979 env[1473]: time="2024-02-13T09:56:10.882961165Z" level=error msg="encountered an error cleaning up failed sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:56:10.883011 env[1473]: time="2024-02-13T09:56:10.882997059Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-787d4945fb-zxn6w,Uid:ac15c9fc-cc5d-4a8f-ac09-16f6497ee733,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:56:10.883039 env[1473]: time="2024-02-13T09:56:10.883022811Z" level=error msg="encountered an error cleaning up failed sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:56:10.883065 env[1473]: time="2024-02-13T09:56:10.883053087Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-787d4945fb-sv24x,Uid:fe6819ac-25fb-455a-b6b5-7432acf1219d,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:56:10.883193 kubelet[2593]: E0213 09:56:10.883180 
2593 remote_runtime.go:176] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:56:10.883232 kubelet[2593]: E0213 09:56:10.883226 2593 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-787d4945fb-zxn6w" Feb 13 09:56:10.883260 kubelet[2593]: E0213 09:56:10.883242 2593 kuberuntime_manager.go:782] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-787d4945fb-zxn6w" Feb 13 09:56:10.883260 kubelet[2593]: E0213 09:56:10.883180 2593 remote_runtime.go:176] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:56:10.883313 kubelet[2593]: E0213 09:56:10.883274 2593 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-787d4945fb-sv24x" Feb 13 09:56:10.883313 kubelet[2593]: E0213 09:56:10.883280 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-787d4945fb-zxn6w_kube-system(ac15c9fc-cc5d-4a8f-ac09-16f6497ee733)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-787d4945fb-zxn6w_kube-system(ac15c9fc-cc5d-4a8f-ac09-16f6497ee733)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-zxn6w" podUID=ac15c9fc-cc5d-4a8f-ac09-16f6497ee733 Feb 13 09:56:10.883313 kubelet[2593]: E0213 09:56:10.883291 2593 kuberuntime_manager.go:782] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" pod="kube-system/coredns-787d4945fb-sv24x" Feb 13 09:56:10.883455 kubelet[2593]: E0213 09:56:10.883321 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-787d4945fb-sv24x_kube-system(fe6819ac-25fb-455a-b6b5-7432acf1219d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-787d4945fb-sv24x_kube-system(fe6819ac-25fb-455a-b6b5-7432acf1219d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-sv24x" podUID=fe6819ac-25fb-455a-b6b5-7432acf1219d Feb 13 09:56:10.884305 env[1473]: time="2024-02-13T09:56:10.884283324Z" level=error msg="Failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:56:10.884508 env[1473]: time="2024-02-13T09:56:10.884491390Z" level=error msg="encountered an error cleaning up failed sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:56:10.884541 env[1473]: time="2024-02-13T09:56:10.884519458Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-86cd8c4979-2tlsw,Uid:18384425-4aba-475c-a64f-6bfe3101b275,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:56:10.884636 kubelet[2593]: E0213 09:56:10.884624 2593 remote_runtime.go:176] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:56:10.884665 kubelet[2593]: E0213 09:56:10.884657 2593 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-86cd8c4979-2tlsw" Feb 13 09:56:10.884693 kubelet[2593]: E0213 09:56:10.884672 2593 kuberuntime_manager.go:782] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-86cd8c4979-2tlsw" Feb 13 09:56:10.884719 kubelet[2593]: E0213 09:56:10.884699 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-86cd8c4979-2tlsw_calico-system(18384425-4aba-475c-a64f-6bfe3101b275)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-86cd8c4979-2tlsw_calico-system(18384425-4aba-475c-a64f-6bfe3101b275)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-86cd8c4979-2tlsw" podUID=18384425-4aba-475c-a64f-6bfe3101b275 Feb 13 09:56:10.894257 env[1473]: time="2024-02-13T09:56:10.894190666Z" level=error msg="Failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:56:10.894442 env[1473]: time="2024-02-13T09:56:10.894398238Z" level=error msg="encountered an error cleaning up failed sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:56:10.894442 env[1473]: time="2024-02-13T09:56:10.894429942Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-w8xgk,Uid:70a6a2a2-80be-4700-bde4-cdae2bf45250,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:56:10.894599 kubelet[2593]: E0213 09:56:10.894560 2593 remote_runtime.go:176] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:56:10.894599 kubelet[2593]: E0213 09:56:10.894590 2593 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-w8xgk" Feb 13 09:56:10.894674 kubelet[2593]: E0213 09:56:10.894604 2593 kuberuntime_manager.go:782] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-w8xgk" Feb 13 09:56:10.894674 kubelet[2593]: E0213 09:56:10.894633 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-w8xgk_calico-system(70a6a2a2-80be-4700-bde4-cdae2bf45250)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-w8xgk_calico-system(70a6a2a2-80be-4700-bde4-cdae2bf45250)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250 Feb 13 09:56:11.052721 env[1473]: time="2024-02-13T09:56:11.052493272Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.27.0\"" Feb 13 09:56:11.053028 kubelet[2593]: I0213 09:56:11.052553 2593 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2" Feb 13 09:56:11.054044 env[1473]: time="2024-02-13T09:56:11.053978913Z" level=info msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\"" Feb 13 09:56:11.054627 kubelet[2593]: I0213 09:56:11.054582 2593 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654" Feb 13 09:56:11.055595 env[1473]: time="2024-02-13T09:56:11.055517346Z" level=info msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\"" Feb 13 09:56:11.056807 kubelet[2593]: I0213 09:56:11.056758 2593 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768" Feb 13 09:56:11.058056 env[1473]: time="2024-02-13T09:56:11.057976030Z" level=info msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\"" Feb 13 09:56:11.059058 kubelet[2593]: I0213 09:56:11.059014 2593 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f" Feb 13 09:56:11.060557 env[1473]: time="2024-02-13T09:56:11.060449930Z" level=info msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\"" Feb 13 09:56:11.101787 env[1473]: time="2024-02-13T09:56:11.101708923Z" level=error msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\" failed" error="failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:56:11.101787 env[1473]: time="2024-02-13T09:56:11.101713511Z" level=error msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\" failed" error="failed to destroy network for sandbox 
\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:56:11.102016 env[1473]: time="2024-02-13T09:56:11.101837692Z" level=error msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\" failed" error="failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:56:11.102087 kubelet[2593]: E0213 09:56:11.101995 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2" Feb 13 09:56:11.102087 kubelet[2593]: E0213 09:56:11.102000 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654" Feb 13 09:56:11.102087 kubelet[2593]: E0213 09:56:11.101996 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768" Feb 13 09:56:11.102087 kubelet[2593]: E0213 09:56:11.102063 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2} Feb 13 09:56:11.102087 kubelet[2593]: E0213 09:56:11.102066 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768} Feb 13 09:56:11.102330 kubelet[2593]: E0213 09:56:11.102102 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 09:56:11.102330 kubelet[2593]: E0213 09:56:11.102121 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with 
KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 09:56:11.102330 kubelet[2593]: E0213 09:56:11.102131 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-sv24x" podUID=fe6819ac-25fb-455a-b6b5-7432acf1219d Feb 13 09:56:11.102330 kubelet[2593]: E0213 09:56:11.102155 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654} Feb 13 09:56:11.102578 kubelet[2593]: E0213 09:56:11.102167 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-86cd8c4979-2tlsw" podUID=18384425-4aba-475c-a64f-6bfe3101b275 Feb 13 09:56:11.102578 kubelet[2593]: E0213 09:56:11.102208 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 09:56:11.102578 kubelet[2593]: E0213 09:56:11.102235 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-zxn6w" podUID=ac15c9fc-cc5d-4a8f-ac09-16f6497ee733 Feb 13 09:56:11.102882 env[1473]: time="2024-02-13T09:56:11.102822905Z" level=error msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\" failed" error="failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 
13 09:56:11.102997 kubelet[2593]: E0213 09:56:11.102968 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f" Feb 13 09:56:11.102997 kubelet[2593]: E0213 09:56:11.102993 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f} Feb 13 09:56:11.103093 kubelet[2593]: E0213 09:56:11.103024 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 09:56:11.103093 kubelet[2593]: E0213 09:56:11.103048 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250 Feb 13 09:56:11.333334 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654-shm.mount: Deactivated successfully. Feb 13 09:56:11.333685 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768-shm.mount: Deactivated successfully. Feb 13 09:56:11.333972 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2-shm.mount: Deactivated successfully. 
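
Every delete failure in this stretch of the log has the same root cause: the Calico CNI plugin's DEL handler consults the node name that the calico/node container writes to /var/lib/calico/nodename at startup, and with calico/node down (or the host path not mounted into it) that file is missing, so network teardown aborts before it begins. A minimal sketch of that guard, assuming a simplified handler (function names here are illustrative, not Calico's actual source; only the error wording is taken from the log):

    // nodename_guard.go -- illustrative sketch of the failing check.
    package main

    import (
        "fmt"
        "os"
    )

    // Written by the calico/node container when it starts with
    // /var/lib/calico/ bind-mounted from the host.
    const nodenameFile = "/var/lib/calico/nodename"

    // cmdDel approximates the guard a CNI DEL handler runs before
    // tearing down a sandbox's network namespace.
    func cmdDel() error {
        if _, err := os.Stat(nodenameFile); err != nil {
            // os.Stat yields exactly the "stat ...: no such file or
            // directory" text that recurs throughout this log.
            return fmt.Errorf("%v: check that the calico/node container is running and has mounted /var/lib/calico/", err)
        }
        name, err := os.ReadFile(nodenameFile)
        if err != nil {
            return err
        }
        fmt.Printf("tearing down endpoints for node %q\n", string(name))
        return nil
    }

    func main() {
        if err := cmdDel(); err != nil {
            fmt.Println(`plugin type="calico" failed (delete):`, err)
        }
    }

Because kubelet's pod workers re-queue the KillPodSandbox on every sync, the identical stat error recurs every ten to fifteen seconds below until calico/node is healthy again.
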
Feb 13 09:56:21.853115 env[1473]: time="2024-02-13T09:56:21.853084508Z" level=info msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\"" Feb 13 09:56:21.868363 env[1473]: time="2024-02-13T09:56:21.868316126Z" level=error msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\" failed" error="failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:56:21.868524 kubelet[2593]: E0213 09:56:21.868510 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654" Feb 13 09:56:21.868713 kubelet[2593]: E0213 09:56:21.868539 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654} Feb 13 09:56:21.868713 kubelet[2593]: E0213 09:56:21.868565 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 09:56:21.868713 kubelet[2593]: E0213 09:56:21.868585 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-zxn6w" podUID=ac15c9fc-cc5d-4a8f-ac09-16f6497ee733 Feb 13 09:56:22.854938 env[1473]: time="2024-02-13T09:56:22.854783839Z" level=info msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\"" Feb 13 09:56:22.906752 env[1473]: time="2024-02-13T09:56:22.906652638Z" level=error msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\" failed" error="failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:56:22.906927 kubelet[2593]: E0213 09:56:22.906905 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": 
plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f" Feb 13 09:56:22.907282 kubelet[2593]: E0213 09:56:22.906958 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f} Feb 13 09:56:22.907282 kubelet[2593]: E0213 09:56:22.907007 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 09:56:22.907282 kubelet[2593]: E0213 09:56:22.907047 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250 Feb 13 09:56:25.854667 env[1473]: time="2024-02-13T09:56:25.854554450Z" level=info msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\"" Feb 13 09:56:25.855558 env[1473]: time="2024-02-13T09:56:25.854719136Z" level=info msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\"" Feb 13 09:56:25.881180 env[1473]: time="2024-02-13T09:56:25.881076066Z" level=error msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\" failed" error="failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:56:25.881313 env[1473]: time="2024-02-13T09:56:25.881194791Z" level=error msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\" failed" error="failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:56:25.881375 kubelet[2593]: E0213 09:56:25.881306 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2" Feb 13 09:56:25.881375 
kubelet[2593]: E0213 09:56:25.881341 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2} Feb 13 09:56:25.881375 kubelet[2593]: E0213 09:56:25.881308 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768" Feb 13 09:56:25.881375 kubelet[2593]: E0213 09:56:25.881376 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768} Feb 13 09:56:25.881600 kubelet[2593]: E0213 09:56:25.881377 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 09:56:25.881600 kubelet[2593]: E0213 09:56:25.881399 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-sv24x" podUID=fe6819ac-25fb-455a-b6b5-7432acf1219d Feb 13 09:56:25.881600 kubelet[2593]: E0213 09:56:25.881403 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 09:56:25.881702 kubelet[2593]: E0213 09:56:25.881426 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-86cd8c4979-2tlsw" podUID=18384425-4aba-475c-a64f-6bfe3101b275 Feb 13 09:56:33.855078 env[1473]: time="2024-02-13T09:56:33.854940374Z" level=info msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\"" Feb 
13 09:56:33.855078 env[1473]: time="2024-02-13T09:56:33.855013020Z" level=info msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\"" Feb 13 09:56:33.882764 env[1473]: time="2024-02-13T09:56:33.882729870Z" level=error msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\" failed" error="failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:56:33.882764 env[1473]: time="2024-02-13T09:56:33.882741151Z" level=error msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\" failed" error="failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:56:33.882900 kubelet[2593]: E0213 09:56:33.882826 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f" Feb 13 09:56:33.882900 kubelet[2593]: E0213 09:56:33.882850 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f} Feb 13 09:56:33.882900 kubelet[2593]: E0213 09:56:33.882872 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 09:56:33.882900 kubelet[2593]: E0213 09:56:33.882889 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250 Feb 13 09:56:33.883180 kubelet[2593]: E0213 09:56:33.882826 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
podSandboxID="2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654" Feb 13 09:56:33.883180 kubelet[2593]: E0213 09:56:33.882910 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654} Feb 13 09:56:33.883180 kubelet[2593]: E0213 09:56:33.882937 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 09:56:33.883180 kubelet[2593]: E0213 09:56:33.882960 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-zxn6w" podUID=ac15c9fc-cc5d-4a8f-ac09-16f6497ee733 Feb 13 09:56:35.851566 update_engine[1463]: I0213 09:56:35.851449 1463 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs Feb 13 09:56:35.851566 update_engine[1463]: I0213 09:56:35.851529 1463 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs Feb 13 09:56:35.852525 update_engine[1463]: I0213 09:56:35.852198 1463 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs Feb 13 09:56:35.853156 update_engine[1463]: I0213 09:56:35.853076 1463 omaha_request_params.cc:62] Current group set to lts Feb 13 09:56:35.853421 update_engine[1463]: I0213 09:56:35.853384 1463 update_attempter.cc:499] Already updated boot flags. Skipping. Feb 13 09:56:35.853421 update_engine[1463]: I0213 09:56:35.853410 1463 update_attempter.cc:643] Scheduling an action processor start. 
Feb 13 09:56:35.853764 update_engine[1463]: I0213 09:56:35.853449 1463 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Feb 13 09:56:35.853764 update_engine[1463]: I0213 09:56:35.853515 1463 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs Feb 13 09:56:35.853764 update_engine[1463]: I0213 09:56:35.853654 1463 omaha_request_action.cc:270] Posting an Omaha request to disabled Feb 13 09:56:35.853764 update_engine[1463]: I0213 09:56:35.853670 1463 omaha_request_action.cc:271] Request: Feb 13 09:56:35.853764 update_engine[1463]: [Omaha request XML body not captured] Feb 13 09:56:35.853764 update_engine[1463]: I0213 09:56:35.853680 1463 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Feb 13 09:56:35.854995 locksmithd[1507]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 Feb 13 09:56:35.856758 update_engine[1463]: I0213 09:56:35.856665 1463 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Feb 13 09:56:35.857027 update_engine[1463]: E0213 09:56:35.856886 1463 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Feb 13 09:56:35.857230 update_engine[1463]: I0213 09:56:35.857040 1463 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 Feb 13 09:56:36.855118 env[1473]: time="2024-02-13T09:56:36.854996824Z" level=info msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\"" Feb 13 09:56:36.881658 env[1473]: time="2024-02-13T09:56:36.881604730Z" level=error msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\" failed" error="failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:56:36.881860 kubelet[2593]: E0213 09:56:36.881847 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2" Feb 13 09:56:36.882033 kubelet[2593]: E0213 09:56:36.881879 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2} Feb 13 09:56:36.882033 kubelet[2593]: E0213 09:56:36.881912 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted
/var/lib/calico/\"" Feb 13 09:56:36.882033 kubelet[2593]: E0213 09:56:36.881938 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-sv24x" podUID=fe6819ac-25fb-455a-b6b5-7432acf1219d Feb 13 09:56:38.854001 env[1473]: time="2024-02-13T09:56:38.853973985Z" level=info msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\"" Feb 13 09:56:38.867645 env[1473]: time="2024-02-13T09:56:38.867574552Z" level=error msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\" failed" error="failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:56:38.867767 kubelet[2593]: E0213 09:56:38.867752 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768" Feb 13 09:56:38.867961 kubelet[2593]: E0213 09:56:38.867784 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768} Feb 13 09:56:38.867961 kubelet[2593]: E0213 09:56:38.867813 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 09:56:38.867961 kubelet[2593]: E0213 09:56:38.867836 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-86cd8c4979-2tlsw" podUID=18384425-4aba-475c-a64f-6bfe3101b275 Feb 13 09:56:45.836257 update_engine[1463]: I0213 09:56:45.836140 1463 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Feb 13 09:56:45.837180 update_engine[1463]: I0213 09:56:45.836632 1463 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP 
Feb 13 09:56:45.837180 update_engine[1463]: E0213 09:56:45.836833 1463 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Feb 13 09:56:45.837180 update_engine[1463]: I0213 09:56:45.837002 1463 libcurl_http_fetcher.cc:283] No HTTP response, retry 2 Feb 13 09:56:47.854465 env[1473]: time="2024-02-13T09:56:47.854364191Z" level=info msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\"" Feb 13 09:56:47.905244 env[1473]: time="2024-02-13T09:56:47.905143301Z" level=error msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\" failed" error="failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:56:47.905478 kubelet[2593]: E0213 09:56:47.905453 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f" Feb 13 09:56:47.905902 kubelet[2593]: E0213 09:56:47.905503 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f} Feb 13 09:56:47.905902 kubelet[2593]: E0213 09:56:47.905556 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 09:56:47.905902 kubelet[2593]: E0213 09:56:47.905598 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250 Feb 13 09:56:48.854808 env[1473]: time="2024-02-13T09:56:48.854722406Z" level=info msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\"" Feb 13 09:56:48.903291 env[1473]: time="2024-02-13T09:56:48.903236274Z" level=error msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\" failed" error="failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" Feb 13 09:56:48.903537 kubelet[2593]: E0213 09:56:48.903516 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654" Feb 13 09:56:48.903623 kubelet[2593]: E0213 09:56:48.903560 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654} Feb 13 09:56:48.903623 kubelet[2593]: E0213 09:56:48.903603 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 09:56:48.903753 kubelet[2593]: E0213 09:56:48.903636 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-zxn6w" podUID=ac15c9fc-cc5d-4a8f-ac09-16f6497ee733 Feb 13 09:56:50.855250 env[1473]: time="2024-02-13T09:56:50.855164464Z" level=info msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\"" Feb 13 09:56:50.904185 env[1473]: time="2024-02-13T09:56:50.904125739Z" level=error msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\" failed" error="failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:56:50.904392 kubelet[2593]: E0213 09:56:50.904368 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2" Feb 13 09:56:50.904707 kubelet[2593]: E0213 09:56:50.904413 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2} Feb 13 09:56:50.904707 kubelet[2593]: E0213 09:56:50.904456 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for 
\"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 09:56:50.904707 kubelet[2593]: E0213 09:56:50.904491 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-sv24x" podUID=fe6819ac-25fb-455a-b6b5-7432acf1219d Feb 13 09:56:51.854152 env[1473]: time="2024-02-13T09:56:51.854050760Z" level=info msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\"" Feb 13 09:56:51.905721 env[1473]: time="2024-02-13T09:56:51.905609948Z" level=error msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\" failed" error="failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:56:51.906176 kubelet[2593]: E0213 09:56:51.905913 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768" Feb 13 09:56:51.906176 kubelet[2593]: E0213 09:56:51.905960 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768} Feb 13 09:56:51.906176 kubelet[2593]: E0213 09:56:51.906012 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 09:56:51.906176 kubelet[2593]: E0213 09:56:51.906052 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/\"" pod="calico-system/calico-kube-controllers-86cd8c4979-2tlsw" podUID=18384425-4aba-475c-a64f-6bfe3101b275 Feb 13 09:56:55.204000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:56:55.232052 kernel: kauditd_printk_skb: 46 callbacks suppressed Feb 13 09:56:55.232252 kernel: audit: type=1400 audit(1707818215.204:1159): avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:56:55.204000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c002760280 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 09:56:55.449210 kernel: audit: type=1300 audit(1707818215.204:1159): arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c002760280 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 09:56:55.449240 kernel: audit: type=1327 audit(1707818215.204:1159): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 09:56:55.204000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 09:56:55.543848 kernel: audit: type=1400 audit(1707818215.204:1160): avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:56:55.204000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:56:55.636023 kernel: audit: type=1300 audit(1707818215.204:1160): arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c000fe1800 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 09:56:55.204000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c000fe1800 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" 
subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 09:56:55.757481 kernel: audit: type=1327 audit(1707818215.204:1160): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 09:56:55.204000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 09:56:55.836032 update_engine[1463]: I0213 09:56:55.835971 1463 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Feb 13 09:56:55.836179 update_engine[1463]: I0213 09:56:55.836071 1463 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Feb 13 09:56:55.836179 update_engine[1463]: E0213 09:56:55.836118 1463 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Feb 13 09:56:55.836179 update_engine[1463]: I0213 09:56:55.836153 1463 libcurl_http_fetcher.cc:283] No HTTP response, retry 3 Feb 13 09:56:55.851487 kernel: audit: type=1400 audit(1707818215.435:1161): avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:56:55.435000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:56:55.943303 kernel: audit: type=1300 audit(1707818215.435:1161): arch=c000003e syscall=254 success=no exit=-13 a0=65 a1=c00176f9b0 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 09:56:55.435000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=65 a1=c00176f9b0 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 09:56:56.042579 kernel: audit: type=1327 audit(1707818215.435:1161): proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 09:56:55.435000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 09:56:56.136759 kernel: audit: type=1400 audit(1707818215.435:1162): avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 
tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:56:55.435000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:56:55.435000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=65 a1=c00057ebc0 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 09:56:55.435000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 09:56:55.436000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:56:55.436000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=65 a1=c00057ec00 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 09:56:55.436000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 09:56:55.436000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/apiserver.crt" dev="sdb9" ino=525073 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:56:55.436000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=66 a1=c00176fa10 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 09:56:55.436000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 09:56:55.436000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-client.crt" dev="sdb9" ino=525079 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:56:55.436000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=65 a1=c004946d20 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 
key=(null) Feb 13 09:56:55.436000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 09:56:55.436000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:56:55.436000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=65 a1=c0127aa5d0 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 09:56:55.436000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 09:57:01.855305 env[1473]: time="2024-02-13T09:57:01.855175433Z" level=info msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\"" Feb 13 09:57:01.855305 env[1473]: time="2024-02-13T09:57:01.855178384Z" level=info msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\"" Feb 13 09:57:01.882132 env[1473]: time="2024-02-13T09:57:01.882044673Z" level=error msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\" failed" error="failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:57:01.882326 kubelet[2593]: E0213 09:57:01.882314 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2" Feb 13 09:57:01.882603 kubelet[2593]: E0213 09:57:01.882353 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2} Feb 13 09:57:01.882603 kubelet[2593]: E0213 09:57:01.882407 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 09:57:01.882603 kubelet[2593]: E0213 09:57:01.882442 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed 
to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-sv24x" podUID=fe6819ac-25fb-455a-b6b5-7432acf1219d Feb 13 09:57:01.882603 kubelet[2593]: E0213 09:57:01.882505 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f" Feb 13 09:57:01.882603 kubelet[2593]: E0213 09:57:01.882549 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f} Feb 13 09:57:01.882740 env[1473]: time="2024-02-13T09:57:01.882354665Z" level=error msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\" failed" error="failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:57:01.882764 kubelet[2593]: E0213 09:57:01.882567 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 09:57:01.882764 kubelet[2593]: E0213 09:57:01.882580 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250 Feb 13 09:57:03.855188 env[1473]: time="2024-02-13T09:57:03.855057396Z" level=info msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\"" Feb 13 09:57:03.881118 env[1473]: time="2024-02-13T09:57:03.881084438Z" level=error msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\" failed" error="failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/" Feb 13 09:57:03.881244 kubelet[2593]: E0213 09:57:03.881234 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654" Feb 13 09:57:03.881432 kubelet[2593]: E0213 09:57:03.881260 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654} Feb 13 09:57:03.881432 kubelet[2593]: E0213 09:57:03.881282 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 09:57:03.881432 kubelet[2593]: E0213 09:57:03.881298 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-zxn6w" podUID=ac15c9fc-cc5d-4a8f-ac09-16f6497ee733 Feb 13 09:57:05.827542 update_engine[1463]: I0213 09:57:05.827414 1463 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Feb 13 09:57:05.828316 update_engine[1463]: I0213 09:57:05.827862 1463 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Feb 13 09:57:05.828316 update_engine[1463]: E0213 09:57:05.828064 1463 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Feb 13 09:57:05.828316 update_engine[1463]: I0213 09:57:05.828210 1463 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Feb 13 09:57:05.828316 update_engine[1463]: I0213 09:57:05.828224 1463 omaha_request_action.cc:621] Omaha request response: Feb 13 09:57:05.828769 update_engine[1463]: E0213 09:57:05.828395 1463 omaha_request_action.cc:640] Omaha request network transfer failed. Feb 13 09:57:05.828769 update_engine[1463]: I0213 09:57:05.828424 1463 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing. Feb 13 09:57:05.828769 update_engine[1463]: I0213 09:57:05.828434 1463 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Feb 13 09:57:05.828769 update_engine[1463]: I0213 09:57:05.828442 1463 update_attempter.cc:306] Processing Done. Feb 13 09:57:05.828769 update_engine[1463]: E0213 09:57:05.828466 1463 update_attempter.cc:619] Update failed. 
Feb 13 09:57:05.828769 update_engine[1463]: I0213 09:57:05.828475 1463 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse Feb 13 09:57:05.828769 update_engine[1463]: I0213 09:57:05.828485 1463 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse) Feb 13 09:57:05.828769 update_engine[1463]: I0213 09:57:05.828493 1463 payload_state.cc:103] Ignoring failures until we get a valid Omaha response. Feb 13 09:57:05.828769 update_engine[1463]: I0213 09:57:05.828644 1463 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Feb 13 09:57:05.828769 update_engine[1463]: I0213 09:57:05.828693 1463 omaha_request_action.cc:270] Posting an Omaha request to disabled Feb 13 09:57:05.828769 update_engine[1463]: I0213 09:57:05.828703 1463 omaha_request_action.cc:271] Request: Feb 13 09:57:05.828769 update_engine[1463]: Feb 13 09:57:05.828769 update_engine[1463]: Feb 13 09:57:05.828769 update_engine[1463]: Feb 13 09:57:05.828769 update_engine[1463]: Feb 13 09:57:05.828769 update_engine[1463]: Feb 13 09:57:05.828769 update_engine[1463]: Feb 13 09:57:05.828769 update_engine[1463]: I0213 09:57:05.828713 1463 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Feb 13 09:57:05.830666 update_engine[1463]: I0213 09:57:05.829020 1463 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Feb 13 09:57:05.830666 update_engine[1463]: E0213 09:57:05.829180 1463 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Feb 13 09:57:05.830666 update_engine[1463]: I0213 09:57:05.829311 1463 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Feb 13 09:57:05.830666 update_engine[1463]: I0213 09:57:05.829325 1463 omaha_request_action.cc:621] Omaha request response: Feb 13 09:57:05.830666 update_engine[1463]: I0213 09:57:05.829350 1463 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Feb 13 09:57:05.830666 update_engine[1463]: I0213 09:57:05.829360 1463 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Feb 13 09:57:05.830666 update_engine[1463]: I0213 09:57:05.829367 1463 update_attempter.cc:306] Processing Done. Feb 13 09:57:05.830666 update_engine[1463]: I0213 09:57:05.829375 1463 update_attempter.cc:310] Error event sent. 
Feb 13 09:57:05.830666 update_engine[1463]: I0213 09:57:05.829396 1463 update_check_scheduler.cc:74] Next update check in 48m43s Feb 13 09:57:05.831562 locksmithd[1507]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0 Feb 13 09:57:05.831562 locksmithd[1507]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0 Feb 13 09:57:05.855149 env[1473]: time="2024-02-13T09:57:05.855008850Z" level=info msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\"" Feb 13 09:57:05.881270 env[1473]: time="2024-02-13T09:57:05.881229652Z" level=error msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\" failed" error="failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:57:05.881499 kubelet[2593]: E0213 09:57:05.881452 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768" Feb 13 09:57:05.881499 kubelet[2593]: E0213 09:57:05.881480 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768} Feb 13 09:57:05.881499 kubelet[2593]: E0213 09:57:05.881501 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 09:57:05.881719 kubelet[2593]: E0213 09:57:05.881518 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-86cd8c4979-2tlsw" podUID=18384425-4aba-475c-a64f-6bfe3101b275 Feb 13 09:57:09.610000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:57:09.638053 kernel: kauditd_printk_skb: 14 callbacks suppressed Feb 13 09:57:09.638098 kernel: audit: type=1400 audit(1707818229.610:1167): avc: denied { watch } for pid=2416 comm="kube-controller" 
path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:57:09.610000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c002267820 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 09:57:09.851719 kernel: audit: type=1300 audit(1707818229.610:1167): arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c002267820 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 09:57:09.851748 kernel: audit: type=1327 audit(1707818229.610:1167): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 09:57:09.610000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 09:57:09.945827 kernel: audit: type=1400 audit(1707818229.613:1168): avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:57:09.613000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:57:10.036272 kernel: audit: type=1300 audit(1707818229.613:1168): arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c002e06500 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 09:57:09.613000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c002e06500 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 09:57:09.613000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 09:57:10.250972 kernel: audit: type=1327 audit(1707818229.613:1168): 
proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 09:57:10.251004 kernel: audit: type=1400 audit(1707818229.615:1169): avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:57:09.615000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:57:10.341577 kernel: audit: type=1300 audit(1707818229.615:1169): arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c00186ffc0 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 09:57:09.615000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c00186ffc0 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 09:57:10.462625 kernel: audit: type=1327 audit(1707818229.615:1169): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 09:57:09.615000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 09:57:09.616000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:57:10.647771 kernel: audit: type=1400 audit(1707818229.616:1170): avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:57:09.616000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c00278d580 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 09:57:09.616000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 
09:57:12.854984 env[1473]: time="2024-02-13T09:57:12.854854406Z" level=info msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\"" Feb 13 09:57:12.881885 env[1473]: time="2024-02-13T09:57:12.881827301Z" level=error msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\" failed" error="failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:57:12.882079 kubelet[2593]: E0213 09:57:12.882067 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2" Feb 13 09:57:12.882262 kubelet[2593]: E0213 09:57:12.882095 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2} Feb 13 09:57:12.882262 kubelet[2593]: E0213 09:57:12.882126 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 09:57:12.882262 kubelet[2593]: E0213 09:57:12.882152 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-sv24x" podUID=fe6819ac-25fb-455a-b6b5-7432acf1219d Feb 13 09:57:14.854213 env[1473]: time="2024-02-13T09:57:14.854108041Z" level=info msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\"" Feb 13 09:57:14.883978 env[1473]: time="2024-02-13T09:57:14.883942794Z" level=error msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\" failed" error="failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:57:14.884139 kubelet[2593]: E0213 09:57:14.884112 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin 
type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654" Feb 13 09:57:14.884300 kubelet[2593]: E0213 09:57:14.884140 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654} Feb 13 09:57:14.884300 kubelet[2593]: E0213 09:57:14.884162 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 09:57:14.884300 kubelet[2593]: E0213 09:57:14.884182 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-zxn6w" podUID=ac15c9fc-cc5d-4a8f-ac09-16f6497ee733 Feb 13 09:57:16.854780 env[1473]: time="2024-02-13T09:57:16.854655912Z" level=info msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\"" Feb 13 09:57:16.905777 env[1473]: time="2024-02-13T09:57:16.905708086Z" level=error msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\" failed" error="failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:57:16.905907 kubelet[2593]: E0213 09:57:16.905871 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f" Feb 13 09:57:16.905907 kubelet[2593]: E0213 09:57:16.905902 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f} Feb 13 09:57:16.906085 kubelet[2593]: E0213 09:57:16.905924 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 09:57:16.906085 kubelet[2593]: E0213 09:57:16.905943 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250 Feb 13 09:57:20.854703 env[1473]: time="2024-02-13T09:57:20.854601790Z" level=info msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\"" Feb 13 09:57:20.893861 env[1473]: time="2024-02-13T09:57:20.893802295Z" level=error msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\" failed" error="failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:57:20.894080 kubelet[2593]: E0213 09:57:20.894060 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768" Feb 13 09:57:20.894400 kubelet[2593]: E0213 09:57:20.894107 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768} Feb 13 09:57:20.894400 kubelet[2593]: E0213 09:57:20.894156 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 09:57:20.894400 kubelet[2593]: E0213 09:57:20.894191 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-86cd8c4979-2tlsw" podUID=18384425-4aba-475c-a64f-6bfe3101b275 Feb 13 09:57:25.854786 env[1473]: time="2024-02-13T09:57:25.854603672Z" level=info msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\"" Feb 13 
09:57:25.854786 env[1473]: time="2024-02-13T09:57:25.854631213Z" level=info msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\"" Feb 13 09:57:25.876993 env[1473]: time="2024-02-13T09:57:25.876956328Z" level=error msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\" failed" error="failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:57:25.877106 env[1473]: time="2024-02-13T09:57:25.876957562Z" level=error msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\" failed" error="failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:57:25.877138 kubelet[2593]: E0213 09:57:25.877124 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654" Feb 13 09:57:25.877310 kubelet[2593]: E0213 09:57:25.877136 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2" Feb 13 09:57:25.877310 kubelet[2593]: E0213 09:57:25.877157 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654} Feb 13 09:57:25.877310 kubelet[2593]: E0213 09:57:25.877157 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2} Feb 13 09:57:25.877310 kubelet[2593]: E0213 09:57:25.877181 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 09:57:25.877310 kubelet[2593]: E0213 09:57:25.877182 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox 
\\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 09:57:25.877463 kubelet[2593]: E0213 09:57:25.877201 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-sv24x" podUID=fe6819ac-25fb-455a-b6b5-7432acf1219d Feb 13 09:57:25.877463 kubelet[2593]: E0213 09:57:25.877202 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-zxn6w" podUID=ac15c9fc-cc5d-4a8f-ac09-16f6497ee733 Feb 13 09:57:30.854883 env[1473]: time="2024-02-13T09:57:30.854786283Z" level=info msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\"" Feb 13 09:57:30.904592 env[1473]: time="2024-02-13T09:57:30.904508848Z" level=error msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\" failed" error="failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:57:30.904859 kubelet[2593]: E0213 09:57:30.904801 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f" Feb 13 09:57:30.905172 kubelet[2593]: E0213 09:57:30.904871 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f} Feb 13 09:57:30.905172 kubelet[2593]: E0213 09:57:30.904928 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 09:57:30.905172 kubelet[2593]: 
E0213 09:57:30.904963 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250 Feb 13 09:57:33.855351 env[1473]: time="2024-02-13T09:57:33.855192359Z" level=info msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\"" Feb 13 09:57:33.882641 env[1473]: time="2024-02-13T09:57:33.882573216Z" level=error msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\" failed" error="failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:57:33.882856 kubelet[2593]: E0213 09:57:33.882841 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768" Feb 13 09:57:33.883027 kubelet[2593]: E0213 09:57:33.882866 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768} Feb 13 09:57:33.883027 kubelet[2593]: E0213 09:57:33.882890 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 09:57:33.883027 kubelet[2593]: E0213 09:57:33.882907 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-86cd8c4979-2tlsw" podUID=18384425-4aba-475c-a64f-6bfe3101b275 Feb 13 09:57:36.855377 env[1473]: time="2024-02-13T09:57:36.855282856Z" level=info msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\"" Feb 13 09:57:36.881132 env[1473]: time="2024-02-13T09:57:36.881096651Z" level=error msg="StopPodSandbox for 
\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\" failed" error="failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:57:36.881310 kubelet[2593]: E0213 09:57:36.881299 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654" Feb 13 09:57:36.881525 kubelet[2593]: E0213 09:57:36.881327 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654} Feb 13 09:57:36.881525 kubelet[2593]: E0213 09:57:36.881376 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 09:57:36.881525 kubelet[2593]: E0213 09:57:36.881422 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-zxn6w" podUID=ac15c9fc-cc5d-4a8f-ac09-16f6497ee733 Feb 13 09:57:38.854780 env[1473]: time="2024-02-13T09:57:38.854716161Z" level=info msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\"" Feb 13 09:57:38.902953 env[1473]: time="2024-02-13T09:57:38.902889238Z" level=error msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\" failed" error="failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:57:38.903137 kubelet[2593]: E0213 09:57:38.903118 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2" Feb 13 09:57:38.903516 
kubelet[2593]: E0213 09:57:38.903163 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2} Feb 13 09:57:38.903516 kubelet[2593]: E0213 09:57:38.903216 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 09:57:38.903516 kubelet[2593]: E0213 09:57:38.903255 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-sv24x" podUID=fe6819ac-25fb-455a-b6b5-7432acf1219d Feb 13 09:57:46.854936 env[1473]: time="2024-02-13T09:57:46.854759591Z" level=info msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\"" Feb 13 09:57:46.884576 env[1473]: time="2024-02-13T09:57:46.884536917Z" level=error msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\" failed" error="failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:57:46.884832 kubelet[2593]: E0213 09:57:46.884798 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f" Feb 13 09:57:46.885018 kubelet[2593]: E0213 09:57:46.884839 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f} Feb 13 09:57:46.885018 kubelet[2593]: E0213 09:57:46.884860 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 09:57:46.885018 kubelet[2593]: E0213 09:57:46.884878 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for 
\"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250 Feb 13 09:57:47.854955 env[1473]: time="2024-02-13T09:57:47.854865033Z" level=info msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\"" Feb 13 09:57:47.908688 env[1473]: time="2024-02-13T09:57:47.908517329Z" level=error msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\" failed" error="failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:57:47.909065 kubelet[2593]: E0213 09:57:47.908947 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768" Feb 13 09:57:47.909065 kubelet[2593]: E0213 09:57:47.909062 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768} Feb 13 09:57:47.909860 kubelet[2593]: E0213 09:57:47.909152 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 09:57:47.909860 kubelet[2593]: E0213 09:57:47.909222 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-86cd8c4979-2tlsw" podUID=18384425-4aba-475c-a64f-6bfe3101b275 Feb 13 09:57:49.854710 env[1473]: time="2024-02-13T09:57:49.854631541Z" level=info msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\"" Feb 13 09:57:49.854710 env[1473]: time="2024-02-13T09:57:49.854633052Z" level=info msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\"" Feb 13 09:57:49.899196 env[1473]: time="2024-02-13T09:57:49.899136303Z" 
level=error msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\" failed" error="failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:57:49.899390 kubelet[2593]: E0213 09:57:49.899359 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654" Feb 13 09:57:49.899829 kubelet[2593]: E0213 09:57:49.899403 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654} Feb 13 09:57:49.899829 kubelet[2593]: E0213 09:57:49.899446 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 09:57:49.899829 kubelet[2593]: E0213 09:57:49.899486 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-zxn6w" podUID=ac15c9fc-cc5d-4a8f-ac09-16f6497ee733 Feb 13 09:57:49.900908 env[1473]: time="2024-02-13T09:57:49.900846672Z" level=error msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\" failed" error="failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:57:49.901030 kubelet[2593]: E0213 09:57:49.900995 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2" Feb 13 09:57:49.901030 kubelet[2593]: E0213 09:57:49.901018 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd 
ID:de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2} Feb 13 09:57:49.901121 kubelet[2593]: E0213 09:57:49.901050 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 09:57:49.901121 kubelet[2593]: E0213 09:57:49.901076 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-sv24x" podUID=fe6819ac-25fb-455a-b6b5-7432acf1219d Feb 13 09:57:55.203000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:57:55.232313 kernel: kauditd_printk_skb: 2 callbacks suppressed Feb 13 09:57:55.232405 kernel: audit: type=1400 audit(1707818275.203:1171): avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:57:55.203000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c002e07500 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 09:57:55.443072 kernel: audit: type=1300 audit(1707818275.203:1171): arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c002e07500 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 09:57:55.443148 kernel: audit: type=1327 audit(1707818275.203:1171): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 09:57:55.203000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 09:57:55.204000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 
scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:57:55.629146 kernel: audit: type=1400 audit(1707818275.204:1172): avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:57:55.629187 kernel: audit: type=1300 audit(1707818275.204:1172): arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c0014a9140 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 09:57:55.204000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c0014a9140 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 09:57:55.749637 kernel: audit: type=1327 audit(1707818275.204:1172): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 09:57:55.204000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 09:57:55.434000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:57:55.934968 kernel: audit: type=1400 audit(1707818275.434:1173): avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:57:55.935020 kernel: audit: type=1300 audit(1707818275.434:1173): arch=c000003e syscall=254 success=no exit=-13 a0=65 a1=c00e897a60 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 09:57:55.434000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=65 a1=c00e897a60 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 09:57:56.034606 kernel: audit: type=1327 audit(1707818275.434:1173): proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 
Feb 13 09:57:55.434000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 09:57:56.128665 kernel: audit: type=1400 audit(1707818275.434:1174): avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:57:55.434000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:57:55.434000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=66 a1=c00fc97710 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 09:57:55.434000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 09:57:55.435000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/apiserver.crt" dev="sdb9" ino=525073 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:57:55.435000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=65 a1=c003d5c960 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 09:57:55.435000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 09:57:55.435000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:57:55.435000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=65 a1=c003d5c990 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 09:57:55.435000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 09:57:55.435000 audit[2425]: AVC avc: denied { watch } 
for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:57:55.435000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=65 a1=c0018dd2e0 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 09:57:55.435000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 09:57:55.435000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-client.crt" dev="sdb9" ino=525079 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:57:55.435000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=65 a1=c00b65d860 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 09:57:55.435000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 09:57:58.853866 env[1473]: time="2024-02-13T09:57:58.853838857Z" level=info msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\"" Feb 13 09:57:58.869678 env[1473]: time="2024-02-13T09:57:58.869629665Z" level=error msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\" failed" error="failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:57:58.869849 kubelet[2593]: E0213 09:57:58.869833 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f" Feb 13 09:57:58.870117 kubelet[2593]: E0213 09:57:58.869869 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f} Feb 13 09:57:58.870117 kubelet[2593]: E0213 09:57:58.869923 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for 
sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 09:57:58.870117 kubelet[2593]: E0213 09:57:58.869965 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250 Feb 13 09:58:00.855305 env[1473]: time="2024-02-13T09:58:00.855178815Z" level=info msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\"" Feb 13 09:58:00.884844 env[1473]: time="2024-02-13T09:58:00.884801278Z" level=error msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\" failed" error="failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:58:00.885058 kubelet[2593]: E0213 09:58:00.885016 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768" Feb 13 09:58:00.885058 kubelet[2593]: E0213 09:58:00.885047 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768} Feb 13 09:58:00.885235 kubelet[2593]: E0213 09:58:00.885068 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 09:58:00.885235 kubelet[2593]: E0213 09:58:00.885086 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-86cd8c4979-2tlsw" podUID=18384425-4aba-475c-a64f-6bfe3101b275 Feb 13 
09:58:01.854845 env[1473]: time="2024-02-13T09:58:01.854721758Z" level=info msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\"" Feb 13 09:58:01.909634 env[1473]: time="2024-02-13T09:58:01.909575470Z" level=error msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\" failed" error="failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:58:01.909997 kubelet[2593]: E0213 09:58:01.909787 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654" Feb 13 09:58:01.909997 kubelet[2593]: E0213 09:58:01.909826 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654} Feb 13 09:58:01.909997 kubelet[2593]: E0213 09:58:01.909869 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 09:58:01.909997 kubelet[2593]: E0213 09:58:01.909902 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-zxn6w" podUID=ac15c9fc-cc5d-4a8f-ac09-16f6497ee733 Feb 13 09:58:04.855086 env[1473]: time="2024-02-13T09:58:04.854987709Z" level=info msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\"" Feb 13 09:58:04.905708 env[1473]: time="2024-02-13T09:58:04.905616629Z" level=error msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\" failed" error="failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:58:04.905912 kubelet[2593]: E0213 09:58:04.905886 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin 
type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2" Feb 13 09:58:04.906262 kubelet[2593]: E0213 09:58:04.905938 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2} Feb 13 09:58:04.906262 kubelet[2593]: E0213 09:58:04.905988 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 09:58:04.906262 kubelet[2593]: E0213 09:58:04.906027 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-sv24x" podUID=fe6819ac-25fb-455a-b6b5-7432acf1219d Feb 13 09:58:09.611000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:58:09.655499 kernel: kauditd_printk_skb: 14 callbacks suppressed Feb 13 09:58:09.655579 kernel: audit: type=1400 audit(1707818289.611:1179): avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:58:09.611000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c001c1d7a0 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 09:58:09.746426 kernel: audit: type=1300 audit(1707818289.611:1179): arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c001c1d7a0 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 09:58:09.853508 env[1473]: time="2024-02-13T09:58:09.853489490Z" level=info msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\"" Feb 13 09:58:09.611000 audit: PROCTITLE 
proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 09:58:09.866325 env[1473]: time="2024-02-13T09:58:09.866294386Z" level=error msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\" failed" error="failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:58:09.866539 kubelet[2593]: E0213 09:58:09.866500 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f" Feb 13 09:58:09.866539 kubelet[2593]: E0213 09:58:09.866527 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f} Feb 13 09:58:09.866719 kubelet[2593]: E0213 09:58:09.866551 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 09:58:09.866719 kubelet[2593]: E0213 09:58:09.866571 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250 Feb 13 09:58:09.958099 kernel: audit: type=1327 audit(1707818289.611:1179): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 09:58:09.958131 kernel: audit: type=1400 audit(1707818289.612:1180): avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:58:09.612000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 
scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:58:10.047554 kernel: audit: type=1300 audit(1707818289.612:1180): arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c000627b40 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 09:58:09.612000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c000627b40 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 09:58:10.168061 kernel: audit: type=1327 audit(1707818289.612:1180): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 09:58:09.612000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 09:58:10.261418 kernel: audit: type=1400 audit(1707818289.615:1181): avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:58:09.615000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:58:09.615000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c0017d72a0 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 09:58:10.472197 kernel: audit: type=1300 audit(1707818289.615:1181): arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c0017d72a0 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 09:58:10.472231 kernel: audit: type=1327 audit(1707818289.615:1181): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 09:58:09.615000 audit: PROCTITLE 
proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 09:58:10.565621 kernel: audit: type=1400 audit(1707818289.615:1182): avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:58:09.615000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:58:09.615000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=c a1=c0014a72e0 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 09:58:09.615000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 09:58:15.854297 env[1473]: time="2024-02-13T09:58:15.854208525Z" level=info msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\"" Feb 13 09:58:15.854297 env[1473]: time="2024-02-13T09:58:15.854239639Z" level=info msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\"" Feb 13 09:58:15.872878 env[1473]: time="2024-02-13T09:58:15.872816073Z" level=error msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\" failed" error="failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:58:15.872878 env[1473]: time="2024-02-13T09:58:15.872830447Z" level=error msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\" failed" error="failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:58:15.873121 kubelet[2593]: E0213 09:58:15.873108 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654" Feb 13 09:58:15.873287 kubelet[2593]: E0213 09:58:15.873137 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd 
ID:2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654} Feb 13 09:58:15.873287 kubelet[2593]: E0213 09:58:15.873161 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 09:58:15.873287 kubelet[2593]: E0213 09:58:15.873109 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768" Feb 13 09:58:15.873287 kubelet[2593]: E0213 09:58:15.873179 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-zxn6w" podUID=ac15c9fc-cc5d-4a8f-ac09-16f6497ee733 Feb 13 09:58:15.873287 kubelet[2593]: E0213 09:58:15.873195 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768} Feb 13 09:58:15.873443 kubelet[2593]: E0213 09:58:15.873216 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 09:58:15.873443 kubelet[2593]: E0213 09:58:15.873231 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-86cd8c4979-2tlsw" podUID=18384425-4aba-475c-a64f-6bfe3101b275 Feb 13 09:58:16.855256 env[1473]: time="2024-02-13T09:58:16.855124638Z" level=info msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\"" Feb 13 09:58:16.906265 env[1473]: time="2024-02-13T09:58:16.906194303Z" level=error msg="StopPodSandbox for 
\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\" failed" error="failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:58:16.906547 kubelet[2593]: E0213 09:58:16.906493 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2" Feb 13 09:58:16.906547 kubelet[2593]: E0213 09:58:16.906541 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2} Feb 13 09:58:16.906969 kubelet[2593]: E0213 09:58:16.906590 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 09:58:16.906969 kubelet[2593]: E0213 09:58:16.906630 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-sv24x" podUID=fe6819ac-25fb-455a-b6b5-7432acf1219d Feb 13 09:58:23.854251 env[1473]: time="2024-02-13T09:58:23.854097300Z" level=info msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\"" Feb 13 09:58:23.904781 env[1473]: time="2024-02-13T09:58:23.904715672Z" level=error msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\" failed" error="failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:58:23.905048 kubelet[2593]: E0213 09:58:23.904992 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f" Feb 13 09:58:23.905048 
kubelet[2593]: E0213 09:58:23.905045 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f} Feb 13 09:58:23.905501 kubelet[2593]: E0213 09:58:23.905096 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 09:58:23.905501 kubelet[2593]: E0213 09:58:23.905137 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250 Feb 13 09:58:27.854720 env[1473]: time="2024-02-13T09:58:27.854586082Z" level=info msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\"" Feb 13 09:58:27.880700 env[1473]: time="2024-02-13T09:58:27.880635220Z" level=error msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\" failed" error="failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:58:27.880855 kubelet[2593]: E0213 09:58:27.880808 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654" Feb 13 09:58:27.880855 kubelet[2593]: E0213 09:58:27.880833 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654} Feb 13 09:58:27.880855 kubelet[2593]: E0213 09:58:27.880853 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 09:58:27.881069 kubelet[2593]: E0213 09:58:27.880870 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for 
\"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-zxn6w" podUID=ac15c9fc-cc5d-4a8f-ac09-16f6497ee733 Feb 13 09:58:28.855573 env[1473]: time="2024-02-13T09:58:28.855472238Z" level=info msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\"" Feb 13 09:58:28.910881 env[1473]: time="2024-02-13T09:58:28.910741842Z" level=error msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\" failed" error="failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:58:28.911224 kubelet[2593]: E0213 09:58:28.911177 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768" Feb 13 09:58:28.911924 kubelet[2593]: E0213 09:58:28.911264 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768} Feb 13 09:58:28.911924 kubelet[2593]: E0213 09:58:28.911378 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 09:58:28.911924 kubelet[2593]: E0213 09:58:28.911460 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-86cd8c4979-2tlsw" podUID=18384425-4aba-475c-a64f-6bfe3101b275 Feb 13 09:58:29.855073 env[1473]: time="2024-02-13T09:58:29.854934979Z" level=info msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\"" Feb 13 09:58:29.906657 env[1473]: time="2024-02-13T09:58:29.906589915Z" level=error msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\" failed" error="failed to destroy network for sandbox 
\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:58:29.907087 kubelet[2593]: E0213 09:58:29.906855 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2" Feb 13 09:58:29.907087 kubelet[2593]: E0213 09:58:29.906902 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2} Feb 13 09:58:29.907087 kubelet[2593]: E0213 09:58:29.906951 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 09:58:29.907087 kubelet[2593]: E0213 09:58:29.906991 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-sv24x" podUID=fe6819ac-25fb-455a-b6b5-7432acf1219d Feb 13 09:58:36.854633 env[1473]: time="2024-02-13T09:58:36.854484164Z" level=info msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\"" Feb 13 09:58:36.880715 env[1473]: time="2024-02-13T09:58:36.880649879Z" level=error msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\" failed" error="failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:58:36.880823 kubelet[2593]: E0213 09:58:36.880806 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f" Feb 13 09:58:36.880984 kubelet[2593]: E0213 09:58:36.880832 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd 
ID:5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f} Feb 13 09:58:36.880984 kubelet[2593]: E0213 09:58:36.880855 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 09:58:36.880984 kubelet[2593]: E0213 09:58:36.880872 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250 Feb 13 09:58:39.854722 env[1473]: time="2024-02-13T09:58:39.854584246Z" level=info msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\"" Feb 13 09:58:39.880886 env[1473]: time="2024-02-13T09:58:39.880822584Z" level=error msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\" failed" error="failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:58:39.881050 kubelet[2593]: E0213 09:58:39.881003 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654" Feb 13 09:58:39.881050 kubelet[2593]: E0213 09:58:39.881030 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654} Feb 13 09:58:39.881225 kubelet[2593]: E0213 09:58:39.881051 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 09:58:39.881225 kubelet[2593]: E0213 09:58:39.881074 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox 
\\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-zxn6w" podUID=ac15c9fc-cc5d-4a8f-ac09-16f6497ee733 Feb 13 09:58:41.854333 env[1473]: time="2024-02-13T09:58:41.854198279Z" level=info msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\"" Feb 13 09:58:41.880594 env[1473]: time="2024-02-13T09:58:41.880514350Z" level=error msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\" failed" error="failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:58:41.880780 kubelet[2593]: E0213 09:58:41.880767 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768" Feb 13 09:58:41.880949 kubelet[2593]: E0213 09:58:41.880816 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768} Feb 13 09:58:41.880949 kubelet[2593]: E0213 09:58:41.880840 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 09:58:41.880949 kubelet[2593]: E0213 09:58:41.880857 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-86cd8c4979-2tlsw" podUID=18384425-4aba-475c-a64f-6bfe3101b275 Feb 13 09:58:43.854619 env[1473]: time="2024-02-13T09:58:43.854529662Z" level=info msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\"" Feb 13 09:58:43.881664 env[1473]: time="2024-02-13T09:58:43.881551075Z" level=error msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\" failed" error="failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: 
check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:58:43.881892 kubelet[2593]: E0213 09:58:43.881880 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2" Feb 13 09:58:43.882055 kubelet[2593]: E0213 09:58:43.881907 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2} Feb 13 09:58:43.882055 kubelet[2593]: E0213 09:58:43.881928 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 09:58:43.882055 kubelet[2593]: E0213 09:58:43.881947 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-sv24x" podUID=fe6819ac-25fb-455a-b6b5-7432acf1219d Feb 13 09:58:49.854326 env[1473]: time="2024-02-13T09:58:49.854232088Z" level=info msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\"" Feb 13 09:58:49.908758 env[1473]: time="2024-02-13T09:58:49.908660849Z" level=error msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\" failed" error="failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:58:49.908981 kubelet[2593]: E0213 09:58:49.908951 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f" Feb 13 09:58:49.909402 kubelet[2593]: E0213 09:58:49.909020 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f} Feb 13 09:58:49.909402 kubelet[2593]: E0213 09:58:49.909117 2593 kuberuntime_manager.go:705] 
"killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 09:58:49.909402 kubelet[2593]: E0213 09:58:49.909172 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250 Feb 13 09:58:52.854178 env[1473]: time="2024-02-13T09:58:52.854040593Z" level=info msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\"" Feb 13 09:58:52.855033 env[1473]: time="2024-02-13T09:58:52.854410726Z" level=info msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\"" Feb 13 09:58:52.903479 env[1473]: time="2024-02-13T09:58:52.903395143Z" level=error msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\" failed" error="failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:58:52.903633 env[1473]: time="2024-02-13T09:58:52.903574234Z" level=error msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\" failed" error="failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:58:52.903732 kubelet[2593]: E0213 09:58:52.903658 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654" Feb 13 09:58:52.903732 kubelet[2593]: E0213 09:58:52.903713 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654} Feb 13 09:58:52.904105 kubelet[2593]: E0213 09:58:52.903759 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: 
check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768" Feb 13 09:58:52.904105 kubelet[2593]: E0213 09:58:52.903777 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 09:58:52.904105 kubelet[2593]: E0213 09:58:52.903791 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768} Feb 13 09:58:52.904105 kubelet[2593]: E0213 09:58:52.903831 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 09:58:52.904361 kubelet[2593]: E0213 09:58:52.903833 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-zxn6w" podUID=ac15c9fc-cc5d-4a8f-ac09-16f6497ee733 Feb 13 09:58:52.904361 kubelet[2593]: E0213 09:58:52.903863 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-86cd8c4979-2tlsw" podUID=18384425-4aba-475c-a64f-6bfe3101b275 Feb 13 09:58:55.204000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:58:55.233197 kernel: kauditd_printk_skb: 2 callbacks suppressed Feb 13 09:58:55.233255 kernel: audit: type=1400 audit(1707818335.204:1183): avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:58:55.204000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b 
a1=c0006f5500 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 09:58:55.445741 kernel: audit: type=1300 audit(1707818335.204:1183): arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c0006f5500 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 09:58:55.445773 kernel: audit: type=1327 audit(1707818335.204:1183): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 09:58:55.204000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 09:58:55.538694 kernel: audit: type=1400 audit(1707818335.204:1184): avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:58:55.204000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:58:55.630085 kernel: audit: type=1300 audit(1707818335.204:1184): arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c0017f9530 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 09:58:55.204000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c0017f9530 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 09:58:55.750735 kernel: audit: type=1327 audit(1707818335.204:1184): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 09:58:55.204000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 09:58:55.844320 kernel: audit: type=1400 audit(1707818335.434:1185): avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" 
dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:58:55.434000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:58:55.853859 env[1473]: time="2024-02-13T09:58:55.853819324Z" level=info msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\"" Feb 13 09:58:55.866694 env[1473]: time="2024-02-13T09:58:55.866655328Z" level=error msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\" failed" error="failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:58:55.867080 kubelet[2593]: E0213 09:58:55.867046 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2" Feb 13 09:58:55.867080 kubelet[2593]: E0213 09:58:55.867072 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2} Feb 13 09:58:55.867267 kubelet[2593]: E0213 09:58:55.867092 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 09:58:55.867267 kubelet[2593]: E0213 09:58:55.867109 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-sv24x" podUID=fe6819ac-25fb-455a-b6b5-7432acf1219d Feb 13 09:58:55.936011 kernel: audit: type=1300 audit(1707818335.434:1185): arch=c000003e syscall=254 success=no exit=-13 a0=65 a1=c00b6aad80 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 09:58:55.434000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=65 a1=c00b6aad80 a2=fc6 
a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 09:58:56.034425 kernel: audit: type=1327 audit(1707818335.434:1185): proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 09:58:55.434000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 09:58:56.127690 kernel: audit: type=1400 audit(1707818335.434:1186): avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:58:55.434000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:58:55.434000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=65 a1=c001a19180 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 09:58:55.434000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 09:58:55.437000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-client.crt" dev="sdb9" ino=525079 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:58:55.437000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=65 a1=c00b6aaf00 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 09:58:55.437000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 09:58:55.437000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:58:55.437000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=65 a1=c0024603e0 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 
gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 09:58:55.437000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 09:58:55.437000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/apiserver.crt" dev="sdb9" ino=525073 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:58:55.437000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:58:55.437000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=65 a1=c010761590 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 09:58:55.437000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=66 a1=c00b6ab950 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 09:58:55.437000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 09:58:55.437000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 09:59:04.854715 env[1473]: time="2024-02-13T09:59:04.854614108Z" level=info msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\"" Feb 13 09:59:04.905610 env[1473]: time="2024-02-13T09:59:04.905488281Z" level=error msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\" failed" error="failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:59:04.905902 kubelet[2593]: E0213 09:59:04.905833 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
podSandboxID="5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f" Feb 13 09:59:04.905902 kubelet[2593]: E0213 09:59:04.905892 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f} Feb 13 09:59:04.906518 kubelet[2593]: E0213 09:59:04.905956 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 09:59:04.906518 kubelet[2593]: E0213 09:59:04.906006 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250 Feb 13 09:59:06.854711 env[1473]: time="2024-02-13T09:59:06.854623223Z" level=info msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\"" Feb 13 09:59:06.881611 env[1473]: time="2024-02-13T09:59:06.881525180Z" level=error msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\" failed" error="failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:59:06.881820 kubelet[2593]: E0213 09:59:06.881808 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654" Feb 13 09:59:06.881985 kubelet[2593]: E0213 09:59:06.881835 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654} Feb 13 09:59:06.881985 kubelet[2593]: E0213 09:59:06.881855 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 09:59:06.881985 kubelet[2593]: E0213 09:59:06.881872 2593 
pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-zxn6w" podUID=ac15c9fc-cc5d-4a8f-ac09-16f6497ee733 Feb 13 09:59:07.854837 env[1473]: time="2024-02-13T09:59:07.854705317Z" level=info msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\"" Feb 13 09:59:07.855648 env[1473]: time="2024-02-13T09:59:07.854900436Z" level=info msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\"" Feb 13 09:59:07.882469 env[1473]: time="2024-02-13T09:59:07.882351800Z" level=error msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\" failed" error="failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:59:07.882684 env[1473]: time="2024-02-13T09:59:07.882517012Z" level=error msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\" failed" error="failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:59:07.882724 kubelet[2593]: E0213 09:59:07.882632 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2" Feb 13 09:59:07.882724 kubelet[2593]: E0213 09:59:07.882658 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768" Feb 13 09:59:07.882724 kubelet[2593]: E0213 09:59:07.882692 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2} Feb 13 09:59:07.882724 kubelet[2593]: E0213 09:59:07.882693 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768} Feb 13 09:59:07.882917 kubelet[2593]: E0213 09:59:07.882726 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to 
\"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 09:59:07.882917 kubelet[2593]: E0213 09:59:07.882726 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 09:59:07.882917 kubelet[2593]: E0213 09:59:07.882744 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-86cd8c4979-2tlsw" podUID=18384425-4aba-475c-a64f-6bfe3101b275 Feb 13 09:59:07.883020 kubelet[2593]: E0213 09:59:07.882742 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-sv24x" podUID=fe6819ac-25fb-455a-b6b5-7432acf1219d Feb 13 09:59:09.613000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:59:09.640975 kernel: kauditd_printk_skb: 14 callbacks suppressed Feb 13 09:59:09.641043 kernel: audit: type=1400 audit(1707818349.613:1191): avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:59:09.613000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c0012c4fc0 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 09:59:09.851053 kernel: audit: type=1300 audit(1707818349.613:1191): arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c0012c4fc0 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 
fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 09:59:09.851101 kernel: audit: type=1327 audit(1707818349.613:1191): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 09:59:09.613000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 09:59:09.945423 kernel: audit: type=1400 audit(1707818349.614:1192): avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:59:09.614000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:59:10.035112 kernel: audit: type=1300 audit(1707818349.614:1192): arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c0012c4fe0 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 09:59:09.614000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c0012c4fe0 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 09:59:10.155603 kernel: audit: type=1327 audit(1707818349.614:1192): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 09:59:09.614000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 09:59:10.249158 kernel: audit: type=1400 audit(1707818349.617:1193): avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:59:09.617000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:59:10.340379 kernel: audit: type=1300 audit(1707818349.617:1193): arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c0019893a0 a2=fc6 a3=0 
items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 09:59:09.617000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c0019893a0 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 09:59:10.463631 kernel: audit: type=1327 audit(1707818349.617:1193): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 09:59:09.617000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 09:59:09.617000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:59:10.647088 kernel: audit: type=1400 audit(1707818349.617:1194): avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:59:09.617000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=c a1=c000fd5c40 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 09:59:09.617000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 09:59:18.855592 env[1473]: time="2024-02-13T09:59:18.855474859Z" level=info msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\"" Feb 13 09:59:18.906391 env[1473]: time="2024-02-13T09:59:18.906291532Z" level=error msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\" failed" error="failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:59:18.906638 kubelet[2593]: E0213 09:59:18.906581 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such 
file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f" Feb 13 09:59:18.906638 kubelet[2593]: E0213 09:59:18.906626 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f} Feb 13 09:59:18.907094 kubelet[2593]: E0213 09:59:18.906676 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 09:59:18.907094 kubelet[2593]: E0213 09:59:18.906716 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250 Feb 13 09:59:19.854664 env[1473]: time="2024-02-13T09:59:19.854577097Z" level=info msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\"" Feb 13 09:59:19.908797 env[1473]: time="2024-02-13T09:59:19.908711459Z" level=error msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\" failed" error="failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:59:19.909159 kubelet[2593]: E0213 09:59:19.908959 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2" Feb 13 09:59:19.909159 kubelet[2593]: E0213 09:59:19.909000 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2} Feb 13 09:59:19.909159 kubelet[2593]: E0213 09:59:19.909041 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/\"" Feb 13 09:59:19.909159 kubelet[2593]: E0213 09:59:19.909074 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-sv24x" podUID=fe6819ac-25fb-455a-b6b5-7432acf1219d Feb 13 09:59:20.854922 env[1473]: time="2024-02-13T09:59:20.854786984Z" level=info msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\"" Feb 13 09:59:20.899480 env[1473]: time="2024-02-13T09:59:20.899377953Z" level=error msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\" failed" error="failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:59:20.899660 kubelet[2593]: E0213 09:59:20.899630 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654" Feb 13 09:59:20.899749 kubelet[2593]: E0213 09:59:20.899679 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654} Feb 13 09:59:20.899749 kubelet[2593]: E0213 09:59:20.899728 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 09:59:20.899912 kubelet[2593]: E0213 09:59:20.899771 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-zxn6w" podUID=ac15c9fc-cc5d-4a8f-ac09-16f6497ee733 Feb 13 09:59:22.854754 env[1473]: time="2024-02-13T09:59:22.854657629Z" level=info msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\"" Feb 13 09:59:22.905772 env[1473]: time="2024-02-13T09:59:22.905712079Z" level=error 
msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\" failed" error="failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:59:22.905956 kubelet[2593]: E0213 09:59:22.905936 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768" Feb 13 09:59:22.906286 kubelet[2593]: E0213 09:59:22.905978 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768} Feb 13 09:59:22.906286 kubelet[2593]: E0213 09:59:22.906026 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 09:59:22.906286 kubelet[2593]: E0213 09:59:22.906059 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-86cd8c4979-2tlsw" podUID=18384425-4aba-475c-a64f-6bfe3101b275 Feb 13 09:59:29.854474 env[1473]: time="2024-02-13T09:59:29.854311249Z" level=info msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\"" Feb 13 09:59:29.871209 env[1473]: time="2024-02-13T09:59:29.871121410Z" level=error msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\" failed" error="failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:59:29.871324 kubelet[2593]: E0213 09:59:29.871297 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
podSandboxID="5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f" Feb 13 09:59:29.871503 kubelet[2593]: E0213 09:59:29.871330 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f} Feb 13 09:59:29.871503 kubelet[2593]: E0213 09:59:29.871363 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 09:59:29.871503 kubelet[2593]: E0213 09:59:29.871381 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250 Feb 13 09:59:31.854886 env[1473]: time="2024-02-13T09:59:31.854755704Z" level=info msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\"" Feb 13 09:59:31.881628 env[1473]: time="2024-02-13T09:59:31.881591017Z" level=error msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\" failed" error="failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:59:31.881835 kubelet[2593]: E0213 09:59:31.881823 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2" Feb 13 09:59:31.882018 kubelet[2593]: E0213 09:59:31.881852 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2} Feb 13 09:59:31.882018 kubelet[2593]: E0213 09:59:31.881884 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 09:59:31.882018 kubelet[2593]: E0213 09:59:31.881910 2593 
pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-sv24x" podUID=fe6819ac-25fb-455a-b6b5-7432acf1219d Feb 13 09:59:33.854118 env[1473]: time="2024-02-13T09:59:33.853985496Z" level=info msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\"" Feb 13 09:59:33.880912 env[1473]: time="2024-02-13T09:59:33.880853101Z" level=error msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\" failed" error="failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:59:33.881178 kubelet[2593]: E0213 09:59:33.881126 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654" Feb 13 09:59:33.881178 kubelet[2593]: E0213 09:59:33.881165 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654} Feb 13 09:59:33.881380 kubelet[2593]: E0213 09:59:33.881186 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 09:59:33.881380 kubelet[2593]: E0213 09:59:33.881205 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-zxn6w" podUID=ac15c9fc-cc5d-4a8f-ac09-16f6497ee733 Feb 13 09:59:34.855215 env[1473]: time="2024-02-13T09:59:34.855068346Z" level=info msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\"" Feb 13 09:59:34.884912 env[1473]: time="2024-02-13T09:59:34.884874172Z" level=error msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\" failed" 
error="failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:59:34.885196 kubelet[2593]: E0213 09:59:34.885139 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768" Feb 13 09:59:34.885196 kubelet[2593]: E0213 09:59:34.885193 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768} Feb 13 09:59:34.885411 kubelet[2593]: E0213 09:59:34.885218 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 09:59:34.885411 kubelet[2593]: E0213 09:59:34.885236 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-86cd8c4979-2tlsw" podUID=18384425-4aba-475c-a64f-6bfe3101b275 Feb 13 09:59:42.854819 env[1473]: time="2024-02-13T09:59:42.854674621Z" level=info msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\"" Feb 13 09:59:42.907749 env[1473]: time="2024-02-13T09:59:42.907677086Z" level=error msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\" failed" error="failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:59:42.907994 kubelet[2593]: E0213 09:59:42.907965 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f" Feb 13 09:59:42.908568 kubelet[2593]: E0213 09:59:42.908015 2593 
kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f} Feb 13 09:59:42.908568 kubelet[2593]: E0213 09:59:42.908068 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 09:59:42.908568 kubelet[2593]: E0213 09:59:42.908110 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250 Feb 13 09:59:44.855166 env[1473]: time="2024-02-13T09:59:44.855007438Z" level=info msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\"" Feb 13 09:59:44.905937 env[1473]: time="2024-02-13T09:59:44.905834362Z" level=error msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\" failed" error="failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:59:44.906134 kubelet[2593]: E0213 09:59:44.906108 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654" Feb 13 09:59:44.906442 kubelet[2593]: E0213 09:59:44.906145 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654} Feb 13 09:59:44.906442 kubelet[2593]: E0213 09:59:44.906185 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 09:59:44.906442 kubelet[2593]: E0213 09:59:44.906216 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc 
error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-zxn6w" podUID=ac15c9fc-cc5d-4a8f-ac09-16f6497ee733 Feb 13 09:59:46.855263 env[1473]: time="2024-02-13T09:59:46.855121855Z" level=info msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\"" Feb 13 09:59:46.872182 env[1473]: time="2024-02-13T09:59:46.872116152Z" level=error msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\" failed" error="failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:59:46.872339 kubelet[2593]: E0213 09:59:46.872325 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2" Feb 13 09:59:46.872519 kubelet[2593]: E0213 09:59:46.872359 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2} Feb 13 09:59:46.872519 kubelet[2593]: E0213 09:59:46.872386 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 09:59:46.872519 kubelet[2593]: E0213 09:59:46.872406 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-sv24x" podUID=fe6819ac-25fb-455a-b6b5-7432acf1219d Feb 13 09:59:48.854322 env[1473]: time="2024-02-13T09:59:48.854257686Z" level=info msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\"" Feb 13 09:59:48.873760 env[1473]: time="2024-02-13T09:59:48.873637301Z" level=error msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\" failed" error="failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:59:48.873956 kubelet[2593]: E0213 09:59:48.873937 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768" Feb 13 09:59:48.874171 kubelet[2593]: E0213 09:59:48.873966 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768} Feb 13 09:59:48.874171 kubelet[2593]: E0213 09:59:48.874002 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 09:59:48.874171 kubelet[2593]: E0213 09:59:48.874033 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-86cd8c4979-2tlsw" podUID=18384425-4aba-475c-a64f-6bfe3101b275 Feb 13 09:59:55.205000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:59:55.234108 kernel: kauditd_printk_skb: 2 callbacks suppressed Feb 13 09:59:55.234194 kernel: audit: type=1400 audit(1707818395.205:1195): avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:59:55.205000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c0024bb800 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 09:59:55.447496 kernel: audit: type=1300 audit(1707818395.205:1195): arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c0024bb800 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" 
subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 09:59:55.447632 kernel: audit: type=1327 audit(1707818395.205:1195): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 09:59:55.205000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 09:59:55.205000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:59:55.630059 kernel: audit: type=1400 audit(1707818395.205:1196): avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:59:55.630098 kernel: audit: type=1300 audit(1707818395.205:1196): arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c001d214c0 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 09:59:55.205000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c001d214c0 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 09:59:55.750405 kernel: audit: type=1327 audit(1707818395.205:1196): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 09:59:55.205000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 09:59:55.843560 kernel: audit: type=1400 audit(1707818395.435:1197): avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:59:55.435000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:59:55.935352 kernel: audit: type=1300 audit(1707818395.435:1197): arch=c000003e syscall=254 success=no exit=-13 a0=65 a1=c012de92f0 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 
egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 09:59:55.435000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=65 a1=c012de92f0 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 09:59:56.034706 kernel: audit: type=1327 audit(1707818395.435:1197): proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 09:59:55.435000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 09:59:56.129514 kernel: audit: type=1400 audit(1707818395.435:1198): avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:59:55.435000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:59:55.435000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=65 a1=c00426e5c0 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 09:59:55.435000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 09:59:55.437000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:59:55.437000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=65 a1=c00c61b710 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 09:59:55.437000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 09:59:55.437000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 
scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:59:55.437000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/apiserver.crt" dev="sdb9" ino=525073 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:59:55.437000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=65 a1=c0115e16b0 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 09:59:55.437000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=66 a1=c00426e5e0 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 09:59:55.437000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 09:59:55.437000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 09:59:55.437000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-client.crt" dev="sdb9" ino=525079 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 09:59:55.437000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=66 a1=c0143c9c20 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 09:59:55.437000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 09:59:56.855081 env[1473]: time="2024-02-13T09:59:56.854995726Z" level=info msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\"" Feb 13 09:59:56.880029 env[1473]: time="2024-02-13T09:59:56.879967617Z" level=error msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\" failed" error="failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:59:56.880156 kubelet[2593]: E0213 09:59:56.880145 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy 
network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f" Feb 13 09:59:56.880317 kubelet[2593]: E0213 09:59:56.880174 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f} Feb 13 09:59:56.880317 kubelet[2593]: E0213 09:59:56.880197 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 09:59:56.880317 kubelet[2593]: E0213 09:59:56.880215 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250 Feb 13 09:59:59.855204 env[1473]: time="2024-02-13T09:59:59.855085184Z" level=info msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\"" Feb 13 09:59:59.881489 env[1473]: time="2024-02-13T09:59:59.881425573Z" level=error msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\" failed" error="failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 09:59:59.881625 kubelet[2593]: E0213 09:59:59.881611 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654" Feb 13 09:59:59.881802 kubelet[2593]: E0213 09:59:59.881642 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654} Feb 13 09:59:59.881802 kubelet[2593]: E0213 09:59:59.881673 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox 
\\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 09:59:59.881802 kubelet[2593]: E0213 09:59:59.881700 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-zxn6w" podUID=ac15c9fc-cc5d-4a8f-ac09-16f6497ee733 Feb 13 10:00:00.854658 env[1473]: time="2024-02-13T10:00:00.854531824Z" level=info msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\"" Feb 13 10:00:00.870794 env[1473]: time="2024-02-13T10:00:00.870733452Z" level=error msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\" failed" error="failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:00:00.871029 kubelet[2593]: E0213 10:00:00.870901 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2" Feb 13 10:00:00.871029 kubelet[2593]: E0213 10:00:00.870931 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2} Feb 13 10:00:00.871029 kubelet[2593]: E0213 10:00:00.870956 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:00:00.871029 kubelet[2593]: E0213 10:00:00.870976 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-sv24x" podUID=fe6819ac-25fb-455a-b6b5-7432acf1219d Feb 13 10:00:03.854131 env[1473]: 
time="2024-02-13T10:00:03.854032215Z" level=info msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\"" Feb 13 10:00:03.880996 env[1473]: time="2024-02-13T10:00:03.880933430Z" level=error msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\" failed" error="failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:00:03.881198 kubelet[2593]: E0213 10:00:03.881164 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768" Feb 13 10:00:03.881198 kubelet[2593]: E0213 10:00:03.881196 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768} Feb 13 10:00:03.881469 kubelet[2593]: E0213 10:00:03.881225 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:00:03.881469 kubelet[2593]: E0213 10:00:03.881251 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-86cd8c4979-2tlsw" podUID=18384425-4aba-475c-a64f-6bfe3101b275 Feb 13 10:00:07.854621 env[1473]: time="2024-02-13T10:00:07.854481493Z" level=info msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\"" Feb 13 10:00:07.905318 env[1473]: time="2024-02-13T10:00:07.905189957Z" level=error msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\" failed" error="failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:00:07.905530 kubelet[2593]: E0213 10:00:07.905487 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin 
type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f" Feb 13 10:00:07.905926 kubelet[2593]: E0213 10:00:07.905549 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f} Feb 13 10:00:07.905926 kubelet[2593]: E0213 10:00:07.905600 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:00:07.905926 kubelet[2593]: E0213 10:00:07.905640 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250 Feb 13 10:00:09.612000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:00:09.641275 kernel: kauditd_printk_skb: 14 callbacks suppressed Feb 13 10:00:09.641350 kernel: audit: type=1400 audit(1707818409.612:1203): avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:00:09.612000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c000225da0 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 10:00:09.852255 kernel: audit: type=1300 audit(1707818409.612:1203): arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c000225da0 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 10:00:09.852286 kernel: audit: type=1327 audit(1707818409.612:1203): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 10:00:09.612000 audit: PROCTITLE 
proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 10:00:09.613000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:00:10.036688 kernel: audit: type=1400 audit(1707818409.613:1204): avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:00:10.036756 kernel: audit: type=1300 audit(1707818409.613:1204): arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c000225de0 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 10:00:09.613000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c000225de0 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 10:00:10.157275 kernel: audit: type=1327 audit(1707818409.613:1204): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 10:00:09.613000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 10:00:10.252313 kernel: audit: type=1400 audit(1707818409.616:1205): avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:00:09.616000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:00:10.342407 kernel: audit: type=1300 audit(1707818409.616:1205): arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c002761be0 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 10:00:09.616000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c002761be0 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" 
exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 10:00:10.462895 kernel: audit: type=1327 audit(1707818409.616:1205): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 10:00:09.616000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 10:00:10.556726 kernel: audit: type=1400 audit(1707818409.616:1206): avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:00:09.616000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:00:09.616000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=c a1=c000225e20 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 10:00:09.616000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 10:00:10.854736 env[1473]: time="2024-02-13T10:00:10.854502889Z" level=info msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\"" Feb 13 10:00:10.880001 env[1473]: time="2024-02-13T10:00:10.879939607Z" level=error msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\" failed" error="failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:00:10.880191 kubelet[2593]: E0213 10:00:10.880154 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654" Feb 13 10:00:10.880191 kubelet[2593]: E0213 10:00:10.880179 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654} Feb 13 10:00:10.880405 kubelet[2593]: E0213 10:00:10.880200 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" 
err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:00:10.880405 kubelet[2593]: E0213 10:00:10.880224 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-zxn6w" podUID=ac15c9fc-cc5d-4a8f-ac09-16f6497ee733 Feb 13 10:00:11.854831 env[1473]: time="2024-02-13T10:00:11.854719950Z" level=info msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\"" Feb 13 10:00:11.880163 env[1473]: time="2024-02-13T10:00:11.880101350Z" level=error msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\" failed" error="failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:00:11.880274 kubelet[2593]: E0213 10:00:11.880262 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2" Feb 13 10:00:11.880443 kubelet[2593]: E0213 10:00:11.880290 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2} Feb 13 10:00:11.880443 kubelet[2593]: E0213 10:00:11.880315 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:00:11.880443 kubelet[2593]: E0213 10:00:11.880334 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-sv24x" podUID=fe6819ac-25fb-455a-b6b5-7432acf1219d Feb 13 10:00:17.854974 env[1473]: time="2024-02-13T10:00:17.854878738Z" level=info msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\"" Feb 13 10:00:17.905671 env[1473]: time="2024-02-13T10:00:17.905637587Z" level=error msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\" failed" error="failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:00:17.905841 kubelet[2593]: E0213 10:00:17.905828 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768" Feb 13 10:00:17.906013 kubelet[2593]: E0213 10:00:17.905854 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768} Feb 13 10:00:17.906013 kubelet[2593]: E0213 10:00:17.905876 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:00:17.906013 kubelet[2593]: E0213 10:00:17.905893 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-86cd8c4979-2tlsw" podUID=18384425-4aba-475c-a64f-6bfe3101b275 Feb 13 10:00:22.855059 env[1473]: time="2024-02-13T10:00:22.854971391Z" level=info msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\"" Feb 13 10:00:22.855059 env[1473]: time="2024-02-13T10:00:22.854971439Z" level=info msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\"" Feb 13 10:00:22.870040 env[1473]: time="2024-02-13T10:00:22.869970950Z" level=error msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\" failed" error="failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:00:22.870180 kubelet[2593]: E0213 10:00:22.870166 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f" Feb 13 10:00:22.870370 kubelet[2593]: E0213 10:00:22.870198 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f} Feb 13 10:00:22.870370 kubelet[2593]: E0213 10:00:22.870226 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:00:22.870370 kubelet[2593]: E0213 10:00:22.870249 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250 Feb 13 10:00:22.870492 env[1473]: time="2024-02-13T10:00:22.870438608Z" level=error msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\" failed" error="failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:00:22.870597 kubelet[2593]: E0213 10:00:22.870560 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654" Feb 13 10:00:22.870597 kubelet[2593]: E0213 10:00:22.870573 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654} Feb 13 10:00:22.870597 kubelet[2593]: E0213 10:00:22.870596 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for 
sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:00:22.870709 kubelet[2593]: E0213 10:00:22.870612 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-zxn6w" podUID=ac15c9fc-cc5d-4a8f-ac09-16f6497ee733 Feb 13 10:00:23.854709 env[1473]: time="2024-02-13T10:00:23.854633459Z" level=info msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\"" Feb 13 10:00:23.871874 env[1473]: time="2024-02-13T10:00:23.871840581Z" level=error msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\" failed" error="failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:00:23.872103 kubelet[2593]: E0213 10:00:23.872030 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2" Feb 13 10:00:23.872103 kubelet[2593]: E0213 10:00:23.872056 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2} Feb 13 10:00:23.872103 kubelet[2593]: E0213 10:00:23.872078 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:00:23.872103 kubelet[2593]: E0213 10:00:23.872098 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-sv24x" podUID=fe6819ac-25fb-455a-b6b5-7432acf1219d Feb 13 10:00:28.854934 
env[1473]: time="2024-02-13T10:00:28.854778841Z" level=info msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\"" Feb 13 10:00:28.906480 env[1473]: time="2024-02-13T10:00:28.906374634Z" level=error msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\" failed" error="failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:00:28.907915 kubelet[2593]: E0213 10:00:28.907889 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768" Feb 13 10:00:28.908356 kubelet[2593]: E0213 10:00:28.907941 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768} Feb 13 10:00:28.908356 kubelet[2593]: E0213 10:00:28.907994 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:00:28.908356 kubelet[2593]: E0213 10:00:28.908037 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-86cd8c4979-2tlsw" podUID=18384425-4aba-475c-a64f-6bfe3101b275 Feb 13 10:00:33.854928 env[1473]: time="2024-02-13T10:00:33.854834835Z" level=info msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\"" Feb 13 10:00:33.880842 env[1473]: time="2024-02-13T10:00:33.880774739Z" level=error msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\" failed" error="failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:00:33.881013 kubelet[2593]: E0213 10:00:33.880960 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin 
type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654" Feb 13 10:00:33.881013 kubelet[2593]: E0213 10:00:33.880987 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654} Feb 13 10:00:33.881013 kubelet[2593]: E0213 10:00:33.881009 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:00:33.881234 kubelet[2593]: E0213 10:00:33.881029 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-zxn6w" podUID=ac15c9fc-cc5d-4a8f-ac09-16f6497ee733 Feb 13 10:00:34.855196 env[1473]: time="2024-02-13T10:00:34.855065837Z" level=info msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\"" Feb 13 10:00:34.873938 env[1473]: time="2024-02-13T10:00:34.873874470Z" level=error msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\" failed" error="failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:00:34.874086 kubelet[2593]: E0213 10:00:34.874068 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2" Feb 13 10:00:34.874150 kubelet[2593]: E0213 10:00:34.874114 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2} Feb 13 10:00:34.874184 kubelet[2593]: E0213 10:00:34.874171 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:00:34.874239 kubelet[2593]: E0213 10:00:34.874202 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-sv24x" podUID=fe6819ac-25fb-455a-b6b5-7432acf1219d Feb 13 10:00:37.854390 env[1473]: time="2024-02-13T10:00:37.854226994Z" level=info msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\"" Feb 13 10:00:37.879903 env[1473]: time="2024-02-13T10:00:37.879839664Z" level=error msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\" failed" error="failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:00:37.880076 kubelet[2593]: E0213 10:00:37.880028 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f" Feb 13 10:00:37.880076 kubelet[2593]: E0213 10:00:37.880056 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f} Feb 13 10:00:37.880253 kubelet[2593]: E0213 10:00:37.880079 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:00:37.880253 kubelet[2593]: E0213 10:00:37.880097 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250 Feb 13 10:00:42.855054 env[1473]: time="2024-02-13T10:00:42.854921059Z" level=info msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\"" Feb 13 10:00:42.871788 
env[1473]: time="2024-02-13T10:00:42.871723863Z" level=error msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\" failed" error="failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:00:42.871889 kubelet[2593]: E0213 10:00:42.871873 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768" Feb 13 10:00:42.872064 kubelet[2593]: E0213 10:00:42.871898 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768} Feb 13 10:00:42.872064 kubelet[2593]: E0213 10:00:42.871921 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:00:42.872064 kubelet[2593]: E0213 10:00:42.871940 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-86cd8c4979-2tlsw" podUID=18384425-4aba-475c-a64f-6bfe3101b275 Feb 13 10:00:45.854009 env[1473]: time="2024-02-13T10:00:45.853930277Z" level=info msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\"" Feb 13 10:00:45.879643 env[1473]: time="2024-02-13T10:00:45.879605928Z" level=error msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\" failed" error="failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:00:45.879837 kubelet[2593]: E0213 10:00:45.879798 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
podSandboxID="de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2" Feb 13 10:00:45.879837 kubelet[2593]: E0213 10:00:45.879824 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2} Feb 13 10:00:45.880022 kubelet[2593]: E0213 10:00:45.879846 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:00:45.880022 kubelet[2593]: E0213 10:00:45.879866 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-sv24x" podUID=fe6819ac-25fb-455a-b6b5-7432acf1219d Feb 13 10:00:46.854086 env[1473]: time="2024-02-13T10:00:46.853978003Z" level=info msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\"" Feb 13 10:00:46.894691 env[1473]: time="2024-02-13T10:00:46.894629796Z" level=error msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\" failed" error="failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:00:46.894855 kubelet[2593]: E0213 10:00:46.894840 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654" Feb 13 10:00:46.895154 kubelet[2593]: E0213 10:00:46.894883 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654} Feb 13 10:00:46.895154 kubelet[2593]: E0213 10:00:46.894926 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:00:46.895154 kubelet[2593]: E0213 10:00:46.894958 2593 
pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-zxn6w" podUID=ac15c9fc-cc5d-4a8f-ac09-16f6497ee733 Feb 13 10:00:48.855516 env[1473]: time="2024-02-13T10:00:48.855387337Z" level=info msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\"" Feb 13 10:00:48.906921 env[1473]: time="2024-02-13T10:00:48.906849711Z" level=error msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\" failed" error="failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:00:48.907185 kubelet[2593]: E0213 10:00:48.907139 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f" Feb 13 10:00:48.907588 kubelet[2593]: E0213 10:00:48.907187 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f} Feb 13 10:00:48.907588 kubelet[2593]: E0213 10:00:48.907241 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:00:48.907588 kubelet[2593]: E0213 10:00:48.907281 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250 Feb 13 10:00:55.206000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:00:55.234560 kernel: kauditd_printk_skb: 2 callbacks suppressed Feb 13 
10:00:55.234643 kernel: audit: type=1400 audit(1707818455.206:1208): avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:00:55.206000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:00:55.415469 kernel: audit: type=1400 audit(1707818455.206:1207): avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:00:55.415521 kernel: audit: type=1300 audit(1707818455.206:1208): arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c001510480 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 10:00:55.206000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c001510480 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 10:00:55.537538 kernel: audit: type=1300 audit(1707818455.206:1207): arch=c000003e syscall=254 success=no exit=-13 a0=c a1=c00229b080 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 10:00:55.206000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=c a1=c00229b080 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 10:00:55.206000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 10:00:55.751143 kernel: audit: type=1327 audit(1707818455.206:1208): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 10:00:55.751174 kernel: audit: type=1327 audit(1707818455.206:1207): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 10:00:55.206000 audit: PROCTITLE 
proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 10:00:55.844195 kernel: audit: type=1400 audit(1707818455.435:1209): avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:00:55.435000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:00:55.935193 kernel: audit: type=1300 audit(1707818455.435:1209): arch=c000003e syscall=254 success=no exit=-13 a0=65 a1=c00b429290 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 10:00:55.435000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=65 a1=c00b429290 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 10:00:55.435000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 10:00:56.126506 kernel: audit: type=1327 audit(1707818455.435:1209): proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 10:00:56.126539 kernel: audit: type=1400 audit(1707818455.435:1210): avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:00:55.435000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:00:55.435000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=65 a1=c0022d0e00 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 10:00:55.435000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 
10:00:55.437000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:00:55.437000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=65 a1=c00b429350 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 10:00:55.437000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 10:00:55.438000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:00:55.438000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=65 a1=c00efa0e20 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 10:00:55.438000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 10:00:55.438000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/apiserver.crt" dev="sdb9" ino=525073 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:00:55.438000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=65 a1=c015c78300 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 10:00:55.438000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 10:00:55.438000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-client.crt" dev="sdb9" ino=525079 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:00:55.438000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=65 a1=c011e67740 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 10:00:55.438000 audit: PROCTITLE 
proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 10:00:56.855529 env[1473]: time="2024-02-13T10:00:56.855010801Z" level=info msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\"" Feb 13 10:00:56.856645 env[1473]: time="2024-02-13T10:00:56.855597363Z" level=info msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\"" Feb 13 10:00:56.871036 env[1473]: time="2024-02-13T10:00:56.870972050Z" level=error msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\" failed" error="failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:00:56.871036 env[1473]: time="2024-02-13T10:00:56.870988077Z" level=error msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\" failed" error="failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:00:56.871167 kubelet[2593]: E0213 10:00:56.871115 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2" Feb 13 10:00:56.871167 kubelet[2593]: E0213 10:00:56.871133 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768" Feb 13 10:00:56.871167 kubelet[2593]: E0213 10:00:56.871145 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2} Feb 13 10:00:56.871167 kubelet[2593]: E0213 10:00:56.871149 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768} Feb 13 10:00:56.871167 kubelet[2593]: E0213 10:00:56.871166 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such 
file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:00:56.871437 kubelet[2593]: E0213 10:00:56.871168 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:00:56.871437 kubelet[2593]: E0213 10:00:56.871185 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-86cd8c4979-2tlsw" podUID=18384425-4aba-475c-a64f-6bfe3101b275 Feb 13 10:00:56.871437 kubelet[2593]: E0213 10:00:56.871185 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-sv24x" podUID=fe6819ac-25fb-455a-b6b5-7432acf1219d Feb 13 10:01:01.854878 env[1473]: time="2024-02-13T10:01:01.854744352Z" level=info msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\"" Feb 13 10:01:01.854878 env[1473]: time="2024-02-13T10:01:01.854799153Z" level=info msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\"" Feb 13 10:01:01.881479 env[1473]: time="2024-02-13T10:01:01.881399816Z" level=error msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\" failed" error="failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:01:01.881663 kubelet[2593]: E0213 10:01:01.881622 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f" Feb 13 10:01:01.881663 kubelet[2593]: E0213 10:01:01.881652 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f} 
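Every StopPodSandbox failure above is the same CNI error: the Calico plugin's delete hook stats /var/lib/calico/nodename, a file the calico/node container writes at startup, and the file is absent, so kubelet's kill/sync loop keeps retrying the same four sandboxes (the two coredns pods, csi-node-driver-w8xgk, and calico-kube-controllers) indefinitely. A minimal triage sketch in Python, meant to run on the affected node: the path comes from the error text itself, while the k8s-app=calico-node label selector is a common Calico default and an assumption here, since the DaemonSet definition never appears in this log.

    #!/usr/bin/env python3
    # Check for the file the CNI delete hook stats; if it is missing,
    # look for the calico/node DaemonSet pod that should have written it.
    import pathlib
    import subprocess

    nodename = pathlib.Path("/var/lib/calico/nodename")  # path from the error above
    if nodename.exists():
        print(f"{nodename} present, node name: {nodename.read_text().strip()!r}")
    else:
        print(f"{nodename} missing: calico/node is likely not running on this host")
        # Assumed label selector; requires kubectl and cluster credentials.
        subprocess.run(
            ["kubectl", "get", "pods", "-A", "-l", "k8s-app=calico-node", "-o", "wide"],
            check=False,
        )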
Feb 13 10:01:01.881859 kubelet[2593]: E0213 10:01:01.881675 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:01:01.881859 kubelet[2593]: E0213 10:01:01.881693 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250 Feb 13 10:01:01.881859 kubelet[2593]: E0213 10:01:01.881768 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654" Feb 13 10:01:01.881859 kubelet[2593]: E0213 10:01:01.881781 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654} Feb 13 10:01:01.881978 env[1473]: time="2024-02-13T10:01:01.881673786Z" level=error msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\" failed" error="failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:01:01.882001 kubelet[2593]: E0213 10:01:01.881799 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:01:01.882001 kubelet[2593]: E0213 10:01:01.881813 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="kube-system/coredns-787d4945fb-zxn6w" podUID=ac15c9fc-cc5d-4a8f-ac09-16f6497ee733 Feb 13 10:01:09.613000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:01:09.641149 kernel: kauditd_printk_skb: 14 callbacks suppressed Feb 13 10:01:09.641205 kernel: audit: type=1400 audit(1707818469.613:1215): avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:01:09.613000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c0024de340 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 10:01:09.850258 kernel: audit: type=1300 audit(1707818469.613:1215): arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c0024de340 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 10:01:09.850294 kernel: audit: type=1327 audit(1707818469.613:1215): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 10:01:09.613000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 10:01:09.614000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:01:10.031939 kernel: audit: type=1400 audit(1707818469.614:1216): avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:01:10.031973 kernel: audit: type=1300 audit(1707818469.614:1216): arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c000dfcdc0 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 10:01:09.614000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c000dfcdc0 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 10:01:10.152376 
kernel: audit: type=1327 audit(1707818469.614:1216): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 10:01:09.614000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 10:01:10.245681 kernel: audit: type=1400 audit(1707818469.616:1217): avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:01:09.616000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:01:10.335699 kernel: audit: type=1300 audit(1707818469.616:1217): arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c0029a8240 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 10:01:09.616000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c0029a8240 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 10:01:10.455846 kernel: audit: type=1327 audit(1707818469.616:1217): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 10:01:09.616000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 10:01:10.506452 systemd[1]: Started sshd@7-139.178.70.43:22-139.178.68.195:45626.service. 
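The interleaved audit records above are SELinux denials, not Calico errors: with enforcement on (permissive=0), the kube-controller and kube-apiserver processes, confined in svirt_lxc_net_t, are refused the watch permission (syscall 254 is inotify_add_watch on x86_64; exit=-13 is EACCES) on the etc_t-labeled certificates under /etc/kubernetes/pki. Each record's PROCTITLE field is the process command line, hex-encoded with NUL-separated arguments and truncated by the kernel. Decoding the kube-controller value shown above takes a few lines of standard-library Python:

    # Decode an audit PROCTITLE value: hex-encoded argv with NUL separators,
    # truncated by the kernel, so the last argument may be cut short.
    proctitle = (
        "6B7562652D636F6E74726F6C6C65722D6D616E6167657200"
        "2D2D616C6C6F636174652D6E6F64652D63696472733D7472756500"
        "2D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F"
        "6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E6600"
        "2D2D617574686F7269"
    )
    argv = bytes.fromhex(proctitle).split(b"\x00")
    print(" ".join(a.decode() for a in argv))
    # -> kube-controller-manager --allocate-node-cidrs=true
    #    --authentication-kubeconfig=/etc/kubernetes/controller-manager.conf --authori

The same decode applied to the kube-apiserver records yields its --advertise-address=139.178.70.43, --allow-privileged=true, and --authorization-mode=Node,RBAC flags, again cut off mid-argument by the proctitle limit.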
Feb 13 10:01:10.548835 kernel: audit: type=1400 audit(1707818469.616:1218): avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:01:09.616000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:01:09.616000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=c a1=c000dfcde0 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 10:01:09.616000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 10:01:10.505000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-139.178.70.43:22-139.178.68.195:45626 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:01:10.658000 audit[6597]: USER_ACCT pid=6597 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:01:10.660142 sshd[6597]: Accepted publickey for core from 139.178.68.195 port 45626 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 10:01:10.659000 audit[6597]: CRED_ACQ pid=6597 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:01:10.659000 audit[6597]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc7f27ee30 a2=3 a3=0 items=0 ppid=1 pid=6597 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=10 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:01:10.659000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 10:01:10.660856 sshd[6597]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 10:01:10.663427 systemd-logind[1461]: New session 10 of user core. Feb 13 10:01:10.663936 systemd[1]: Started session-10.scope. 
Feb 13 10:01:10.664000 audit[6597]: USER_START pid=6597 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:01:10.665000 audit[6599]: CRED_ACQ pid=6599 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:01:10.750754 sshd[6597]: pam_unix(sshd:session): session closed for user core Feb 13 10:01:10.750000 audit[6597]: USER_END pid=6597 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:01:10.750000 audit[6597]: CRED_DISP pid=6597 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:01:10.752124 systemd[1]: sshd@7-139.178.70.43:22-139.178.68.195:45626.service: Deactivated successfully. Feb 13 10:01:10.750000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-139.178.70.43:22-139.178.68.195:45626 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:01:10.752596 systemd[1]: session-10.scope: Deactivated successfully. Feb 13 10:01:10.753034 systemd-logind[1461]: Session 10 logged out. Waiting for processes to exit. Feb 13 10:01:10.753661 systemd-logind[1461]: Removed session 10. 
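Each SSH connection here leaves the same bracketed trail: systemd SERVICE_START for the per-connection sshd@... unit, then the PAM sequence USER_ACCT, CRED_ACQ, USER_START on login and USER_END, CRED_DISP on logout, closed by SERVICE_STOP. A small sketch that pairs the open/close lines from journal text piped on stdin; the regexes target the pam_unix strings exactly as they appear above, and the journalctl invocation in the comment is only an example invocation, not taken from this log.

    import re
    import sys

    # Pair sshd session open/close lines from journal text on stdin,
    # e.g. `journalctl -u sshd | python3 sessions.py` (example invocation).
    OPEN = re.compile(r"sshd\[(\d+)\]: pam_unix\(sshd:session\): session opened for user (\S+)\(")
    CLOSE = re.compile(r"sshd\[(\d+)\]: pam_unix\(sshd:session\): session closed for user (\S+)")

    open_sessions = {}
    for line in sys.stdin:
        if m := OPEN.search(line):
            open_sessions[m.group(1)] = m.group(2)  # key by sshd pid
        elif m := CLOSE.search(line):
            user = open_sessions.pop(m.group(1), m.group(2))
            print(f"sshd pid {m.group(1)}: session for {user!r} opened and closed")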
Feb 13 10:01:10.855043 env[1473]: time="2024-02-13T10:01:10.854836913Z" level=info msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\"" Feb 13 10:01:10.855043 env[1473]: time="2024-02-13T10:01:10.854927378Z" level=info msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\"" Feb 13 10:01:10.872576 env[1473]: time="2024-02-13T10:01:10.872498510Z" level=error msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\" failed" error="failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:01:10.872697 env[1473]: time="2024-02-13T10:01:10.872663875Z" level=error msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\" failed" error="failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:01:10.872743 kubelet[2593]: E0213 10:01:10.872725 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2" Feb 13 10:01:10.872927 kubelet[2593]: E0213 10:01:10.872755 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2} Feb 13 10:01:10.872927 kubelet[2593]: E0213 10:01:10.872782 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:01:10.872927 kubelet[2593]: E0213 10:01:10.872798 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768" Feb 13 10:01:10.872927 kubelet[2593]: E0213 10:01:10.872802 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin 
type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-sv24x" podUID=fe6819ac-25fb-455a-b6b5-7432acf1219d Feb 13 10:01:10.872927 kubelet[2593]: E0213 10:01:10.872815 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768} Feb 13 10:01:10.873078 kubelet[2593]: E0213 10:01:10.872837 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:01:10.873078 kubelet[2593]: E0213 10:01:10.872854 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-86cd8c4979-2tlsw" podUID=18384425-4aba-475c-a64f-6bfe3101b275 Feb 13 10:01:13.854579 env[1473]: time="2024-02-13T10:01:13.854462614Z" level=info msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\"" Feb 13 10:01:13.879223 env[1473]: time="2024-02-13T10:01:13.879158832Z" level=error msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\" failed" error="failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:01:13.879412 kubelet[2593]: E0213 10:01:13.879395 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f" Feb 13 10:01:13.879564 kubelet[2593]: E0213 10:01:13.879422 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f} Feb 13 10:01:13.879564 kubelet[2593]: E0213 10:01:13.879446 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:01:13.879564 kubelet[2593]: E0213 10:01:13.879463 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250 Feb 13 10:01:15.760062 systemd[1]: Started sshd@8-139.178.70.43:22-139.178.68.195:45638.service. Feb 13 10:01:15.758000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-139.178.70.43:22-139.178.68.195:45638 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:01:15.787553 kernel: kauditd_printk_skb: 13 callbacks suppressed Feb 13 10:01:15.787642 kernel: audit: type=1130 audit(1707818475.758:1228): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-139.178.70.43:22-139.178.68.195:45638 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:01:15.854034 env[1473]: time="2024-02-13T10:01:15.853999498Z" level=info msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\"" Feb 13 10:01:15.867096 env[1473]: time="2024-02-13T10:01:15.867037178Z" level=error msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\" failed" error="failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:01:15.867285 kubelet[2593]: E0213 10:01:15.867274 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654" Feb 13 10:01:15.867496 kubelet[2593]: E0213 10:01:15.867302 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654} Feb 13 10:01:15.867496 kubelet[2593]: E0213 10:01:15.867329 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:01:15.867496 kubelet[2593]: E0213 10:01:15.867371 2593 
pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-zxn6w" podUID=ac15c9fc-cc5d-4a8f-ac09-16f6497ee733 Feb 13 10:01:15.898000 audit[6718]: USER_ACCT pid=6718 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:01:15.899844 sshd[6718]: Accepted publickey for core from 139.178.68.195 port 45638 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 10:01:15.901662 sshd[6718]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 10:01:15.904095 systemd-logind[1461]: New session 11 of user core. Feb 13 10:01:15.904537 systemd[1]: Started session-11.scope. Feb 13 10:01:15.982561 sshd[6718]: pam_unix(sshd:session): session closed for user core Feb 13 10:01:15.984023 systemd[1]: sshd@8-139.178.70.43:22-139.178.68.195:45638.service: Deactivated successfully. Feb 13 10:01:15.984453 systemd[1]: session-11.scope: Deactivated successfully. Feb 13 10:01:15.984852 systemd-logind[1461]: Session 11 logged out. Waiting for processes to exit. Feb 13 10:01:15.985292 systemd-logind[1461]: Removed session 11. Feb 13 10:01:15.900000 audit[6718]: CRED_ACQ pid=6718 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:01:16.082313 kernel: audit: type=1101 audit(1707818475.898:1229): pid=6718 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:01:16.082355 kernel: audit: type=1103 audit(1707818475.900:1230): pid=6718 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:01:16.082377 kernel: audit: type=1006 audit(1707818475.900:1231): pid=6718 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=11 res=1 Feb 13 10:01:16.140904 kernel: audit: type=1300 audit(1707818475.900:1231): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd3134c1a0 a2=3 a3=0 items=0 ppid=1 pid=6718 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=11 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:01:15.900000 audit[6718]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd3134c1a0 a2=3 a3=0 items=0 ppid=1 pid=6718 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=11 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:01:16.233429 kernel: audit: type=1327 
audit(1707818475.900:1231): proctitle=737368643A20636F7265205B707269765D Feb 13 10:01:15.900000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 10:01:16.264243 kernel: audit: type=1105 audit(1707818475.905:1232): pid=6718 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:01:15.905000 audit[6718]: USER_START pid=6718 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:01:16.359608 kernel: audit: type=1103 audit(1707818475.905:1233): pid=6749 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:01:15.905000 audit[6749]: CRED_ACQ pid=6749 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:01:16.448820 kernel: audit: type=1106 audit(1707818475.981:1234): pid=6718 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:01:15.981000 audit[6718]: USER_END pid=6718 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:01:16.544523 kernel: audit: type=1104 audit(1707818475.982:1235): pid=6718 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:01:15.982000 audit[6718]: CRED_DISP pid=6718 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:01:15.982000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-139.178.70.43:22-139.178.68.195:45638 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:01:20.992632 systemd[1]: Started sshd@9-139.178.70.43:22-139.178.68.195:42114.service. Feb 13 10:01:20.991000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-139.178.70.43:22-139.178.68.195:42114 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 10:01:21.019339 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 10:01:21.019378 kernel: audit: type=1130 audit(1707818480.991:1237): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-139.178.70.43:22-139.178.68.195:42114 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:01:21.126000 audit[6775]: USER_ACCT pid=6775 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:01:21.127783 sshd[6775]: Accepted publickey for core from 139.178.68.195 port 42114 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 10:01:21.128660 sshd[6775]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 10:01:21.130950 systemd-logind[1461]: New session 12 of user core. Feb 13 10:01:21.131482 systemd[1]: Started session-12.scope. Feb 13 10:01:21.209467 sshd[6775]: pam_unix(sshd:session): session closed for user core Feb 13 10:01:21.210951 systemd[1]: sshd@9-139.178.70.43:22-139.178.68.195:42114.service: Deactivated successfully. Feb 13 10:01:21.211379 systemd[1]: session-12.scope: Deactivated successfully. Feb 13 10:01:21.211791 systemd-logind[1461]: Session 12 logged out. Waiting for processes to exit. Feb 13 10:01:21.212267 systemd-logind[1461]: Removed session 12. Feb 13 10:01:21.127000 audit[6775]: CRED_ACQ pid=6775 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:01:21.309903 kernel: audit: type=1101 audit(1707818481.126:1238): pid=6775 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:01:21.309949 kernel: audit: type=1103 audit(1707818481.127:1239): pid=6775 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:01:21.309979 kernel: audit: type=1006 audit(1707818481.127:1240): pid=6775 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=12 res=1 Feb 13 10:01:21.368362 kernel: audit: type=1300 audit(1707818481.127:1240): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffe91acfe0 a2=3 a3=0 items=0 ppid=1 pid=6775 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=12 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:01:21.127000 audit[6775]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffe91acfe0 a2=3 a3=0 items=0 ppid=1 pid=6775 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=12 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:01:21.127000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 10:01:21.490876 kernel: audit: type=1327 audit(1707818481.127:1240): proctitle=737368643A20636F7265205B707269765D Feb 13 10:01:21.490975 kernel: 
audit: type=1105 audit(1707818481.132:1241): pid=6775 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:01:21.132000 audit[6775]: USER_START pid=6775 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:01:21.133000 audit[6777]: CRED_ACQ pid=6777 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:01:21.674798 kernel: audit: type=1103 audit(1707818481.133:1242): pid=6777 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:01:21.674840 kernel: audit: type=1106 audit(1707818481.208:1243): pid=6775 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:01:21.208000 audit[6775]: USER_END pid=6775 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:01:21.770417 kernel: audit: type=1104 audit(1707818481.208:1244): pid=6775 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:01:21.208000 audit[6775]: CRED_DISP pid=6775 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:01:21.209000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-139.178.70.43:22-139.178.68.195:42114 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 10:01:24.854871 env[1473]: time="2024-02-13T10:01:24.854730501Z" level=info msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\"" Feb 13 10:01:24.855809 env[1473]: time="2024-02-13T10:01:24.855030772Z" level=info msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\"" Feb 13 10:01:24.855809 env[1473]: time="2024-02-13T10:01:24.855073516Z" level=info msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\"" Feb 13 10:01:24.873211 env[1473]: time="2024-02-13T10:01:24.873166911Z" level=error msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\" failed" error="failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:01:24.873211 env[1473]: time="2024-02-13T10:01:24.873199185Z" level=error msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\" failed" error="failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:01:24.873449 kubelet[2593]: E0213 10:01:24.873394 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768" Feb 13 10:01:24.873615 kubelet[2593]: E0213 10:01:24.873454 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768} Feb 13 10:01:24.873615 kubelet[2593]: E0213 10:01:24.873394 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f" Feb 13 10:01:24.873615 kubelet[2593]: E0213 10:01:24.873477 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:01:24.873615 kubelet[2593]: E0213 10:01:24.873491 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd 
ID:5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f} Feb 13 10:01:24.873615 kubelet[2593]: E0213 10:01:24.873500 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-86cd8c4979-2tlsw" podUID=18384425-4aba-475c-a64f-6bfe3101b275 Feb 13 10:01:24.873747 env[1473]: time="2024-02-13T10:01:24.873472896Z" level=error msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\" failed" error="failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:01:24.873770 kubelet[2593]: E0213 10:01:24.873510 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:01:24.873770 kubelet[2593]: E0213 10:01:24.873525 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250 Feb 13 10:01:24.873770 kubelet[2593]: E0213 10:01:24.873565 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2" Feb 13 10:01:24.873770 kubelet[2593]: E0213 10:01:24.873573 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2} Feb 13 10:01:24.873870 kubelet[2593]: E0213 10:01:24.873587 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin 
type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:01:24.873870 kubelet[2593]: E0213 10:01:24.873600 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-sv24x" podUID=fe6819ac-25fb-455a-b6b5-7432acf1219d Feb 13 10:01:26.218889 systemd[1]: Started sshd@10-139.178.70.43:22-139.178.68.195:41612.service. Feb 13 10:01:26.217000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-139.178.70.43:22-139.178.68.195:41612 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:01:26.245511 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 10:01:26.245551 kernel: audit: type=1130 audit(1707818486.217:1246): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-139.178.70.43:22-139.178.68.195:41612 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:01:26.354000 audit[6886]: USER_ACCT pid=6886 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:01:26.355543 sshd[6886]: Accepted publickey for core from 139.178.68.195 port 41612 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 10:01:26.356673 sshd[6886]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 10:01:26.359064 systemd-logind[1461]: New session 13 of user core. Feb 13 10:01:26.359600 systemd[1]: Started session-13.scope. Feb 13 10:01:26.438272 sshd[6886]: pam_unix(sshd:session): session closed for user core Feb 13 10:01:26.439818 systemd[1]: sshd@10-139.178.70.43:22-139.178.68.195:41612.service: Deactivated successfully. Feb 13 10:01:26.440246 systemd[1]: session-13.scope: Deactivated successfully. Feb 13 10:01:26.440632 systemd-logind[1461]: Session 13 logged out. Waiting for processes to exit. Feb 13 10:01:26.441120 systemd-logind[1461]: Removed session 13. 
Feb 13 10:01:26.355000 audit[6886]: CRED_ACQ pid=6886 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:01:26.537527 kernel: audit: type=1101 audit(1707818486.354:1247): pid=6886 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:01:26.537570 kernel: audit: type=1103 audit(1707818486.355:1248): pid=6886 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:01:26.537588 kernel: audit: type=1006 audit(1707818486.355:1249): pid=6886 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=13 res=1 Feb 13 10:01:26.596125 kernel: audit: type=1300 audit(1707818486.355:1249): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe90880e60 a2=3 a3=0 items=0 ppid=1 pid=6886 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=13 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:01:26.355000 audit[6886]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe90880e60 a2=3 a3=0 items=0 ppid=1 pid=6886 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=13 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:01:26.688189 kernel: audit: type=1327 audit(1707818486.355:1249): proctitle=737368643A20636F7265205B707269765D Feb 13 10:01:26.355000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 10:01:26.718725 kernel: audit: type=1105 audit(1707818486.360:1250): pid=6886 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:01:26.360000 audit[6886]: USER_START pid=6886 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:01:26.813308 kernel: audit: type=1103 audit(1707818486.360:1251): pid=6888 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:01:26.360000 audit[6888]: CRED_ACQ pid=6888 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:01:26.437000 audit[6886]: USER_END pid=6886 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" 
hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:01:26.998191 kernel: audit: type=1106 audit(1707818486.437:1252): pid=6886 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:01:26.998226 kernel: audit: type=1104 audit(1707818486.437:1253): pid=6886 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:01:26.437000 audit[6886]: CRED_DISP pid=6886 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:01:26.438000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-139.178.70.43:22-139.178.68.195:41612 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:01:27.854165 env[1473]: time="2024-02-13T10:01:27.854075424Z" level=info msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\"" Feb 13 10:01:27.902612 env[1473]: time="2024-02-13T10:01:27.902512651Z" level=error msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\" failed" error="failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:01:27.902794 kubelet[2593]: E0213 10:01:27.902753 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654" Feb 13 10:01:27.902794 kubelet[2593]: E0213 10:01:27.902795 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654} Feb 13 10:01:27.903225 kubelet[2593]: E0213 10:01:27.902846 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:01:27.903225 kubelet[2593]: E0213 10:01:27.902888 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for 
sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-zxn6w" podUID=ac15c9fc-cc5d-4a8f-ac09-16f6497ee733 Feb 13 10:01:31.447786 systemd[1]: Started sshd@11-139.178.70.43:22-139.178.68.195:41622.service. Feb 13 10:01:31.446000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-139.178.70.43:22-139.178.68.195:41622 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:01:31.474807 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 10:01:31.474870 kernel: audit: type=1130 audit(1707818491.446:1255): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-139.178.70.43:22-139.178.68.195:41622 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:01:31.583000 audit[6943]: USER_ACCT pid=6943 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:01:31.584709 sshd[6943]: Accepted publickey for core from 139.178.68.195 port 41622 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 10:01:31.586262 sshd[6943]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 10:01:31.588711 systemd-logind[1461]: New session 14 of user core. Feb 13 10:01:31.589169 systemd[1]: Started session-14.scope. Feb 13 10:01:31.671279 sshd[6943]: pam_unix(sshd:session): session closed for user core Feb 13 10:01:31.672908 systemd[1]: sshd@11-139.178.70.43:22-139.178.68.195:41622.service: Deactivated successfully. Feb 13 10:01:31.673326 systemd[1]: session-14.scope: Deactivated successfully. Feb 13 10:01:31.673722 systemd-logind[1461]: Session 14 logged out. Waiting for processes to exit. Feb 13 10:01:31.674145 systemd-logind[1461]: Removed session 14. 
Feb 13 10:01:31.584000 audit[6943]: CRED_ACQ pid=6943 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:01:31.768479 kernel: audit: type=1101 audit(1707818491.583:1256): pid=6943 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:01:31.768518 kernel: audit: type=1103 audit(1707818491.584:1257): pid=6943 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:01:31.768534 kernel: audit: type=1006 audit(1707818491.584:1258): pid=6943 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=14 res=1 Feb 13 10:01:31.827059 kernel: audit: type=1300 audit(1707818491.584:1258): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc8cbf5670 a2=3 a3=0 items=0 ppid=1 pid=6943 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=14 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:01:31.584000 audit[6943]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc8cbf5670 a2=3 a3=0 items=0 ppid=1 pid=6943 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=14 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:01:31.919077 kernel: audit: type=1327 audit(1707818491.584:1258): proctitle=737368643A20636F7265205B707269765D Feb 13 10:01:31.584000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 10:01:31.949568 kernel: audit: type=1105 audit(1707818491.590:1259): pid=6943 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:01:31.590000 audit[6943]: USER_START pid=6943 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:01:32.044048 kernel: audit: type=1103 audit(1707818491.590:1260): pid=6945 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:01:31.590000 audit[6945]: CRED_ACQ pid=6945 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:01:32.133280 kernel: audit: type=1106 audit(1707818491.670:1261): pid=6943 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail 
acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:01:31.670000 audit[6943]: USER_END pid=6943 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:01:32.228831 kernel: audit: type=1104 audit(1707818491.670:1262): pid=6943 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:01:31.670000 audit[6943]: CRED_DISP pid=6943 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:01:31.671000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-139.178.70.43:22-139.178.68.195:41622 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:01:35.854657 env[1473]: time="2024-02-13T10:01:35.854509871Z" level=info msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\"" Feb 13 10:01:35.854657 env[1473]: time="2024-02-13T10:01:35.854622817Z" level=info msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\"" Feb 13 10:01:35.884330 env[1473]: time="2024-02-13T10:01:35.884292137Z" level=error msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\" failed" error="failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:01:35.884330 env[1473]: time="2024-02-13T10:01:35.884298353Z" level=error msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\" failed" error="failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:01:35.884560 kubelet[2593]: E0213 10:01:35.884534 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f" Feb 13 10:01:35.884732 kubelet[2593]: E0213 10:01:35.884572 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f} Feb 13 10:01:35.884732 kubelet[2593]: E0213 10:01:35.884607 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for 
\"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:01:35.884732 kubelet[2593]: E0213 10:01:35.884533 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768" Feb 13 10:01:35.884732 kubelet[2593]: E0213 10:01:35.884623 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250 Feb 13 10:01:35.884732 kubelet[2593]: E0213 10:01:35.884639 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768} Feb 13 10:01:35.884866 kubelet[2593]: E0213 10:01:35.884659 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:01:35.884866 kubelet[2593]: E0213 10:01:35.884675 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-86cd8c4979-2tlsw" podUID=18384425-4aba-475c-a64f-6bfe3101b275 Feb 13 10:01:36.681706 systemd[1]: Started sshd@12-139.178.70.43:22-139.178.68.195:43916.service. Feb 13 10:01:36.680000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-139.178.70.43:22-139.178.68.195:43916 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 10:01:36.708604 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 10:01:36.708701 kernel: audit: type=1130 audit(1707818496.680:1264): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-139.178.70.43:22-139.178.68.195:43916 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:01:36.818000 audit[7030]: USER_ACCT pid=7030 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:01:36.819418 sshd[7030]: Accepted publickey for core from 139.178.68.195 port 43916 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 10:01:36.820642 sshd[7030]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 10:01:36.822930 systemd-logind[1461]: New session 15 of user core. Feb 13 10:01:36.823367 systemd[1]: Started session-15.scope. Feb 13 10:01:36.902454 sshd[7030]: pam_unix(sshd:session): session closed for user core Feb 13 10:01:36.903854 systemd[1]: sshd@12-139.178.70.43:22-139.178.68.195:43916.service: Deactivated successfully. Feb 13 10:01:36.904285 systemd[1]: session-15.scope: Deactivated successfully. Feb 13 10:01:36.904608 systemd-logind[1461]: Session 15 logged out. Waiting for processes to exit. Feb 13 10:01:36.905020 systemd-logind[1461]: Removed session 15. Feb 13 10:01:36.819000 audit[7030]: CRED_ACQ pid=7030 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:01:37.001319 kernel: audit: type=1101 audit(1707818496.818:1265): pid=7030 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:01:37.001361 kernel: audit: type=1103 audit(1707818496.819:1266): pid=7030 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:01:37.001380 kernel: audit: type=1006 audit(1707818496.819:1267): pid=7030 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=15 res=1 Feb 13 10:01:37.059893 kernel: audit: type=1300 audit(1707818496.819:1267): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe582b44a0 a2=3 a3=0 items=0 ppid=1 pid=7030 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=15 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:01:36.819000 audit[7030]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe582b44a0 a2=3 a3=0 items=0 ppid=1 pid=7030 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=15 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:01:37.151889 kernel: audit: type=1327 audit(1707818496.819:1267): proctitle=737368643A20636F7265205B707269765D Feb 13 10:01:36.819000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 10:01:37.182532 kernel: 
audit: type=1105 audit(1707818496.824:1268): pid=7030 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:01:36.824000 audit[7030]: USER_START pid=7030 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:01:37.277039 kernel: audit: type=1103 audit(1707818496.825:1269): pid=7032 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:01:36.825000 audit[7032]: CRED_ACQ pid=7032 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:01:36.901000 audit[7030]: USER_END pid=7030 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:01:37.461845 kernel: audit: type=1106 audit(1707818496.901:1270): pid=7030 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:01:37.461879 kernel: audit: type=1104 audit(1707818496.901:1271): pid=7030 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:01:36.901000 audit[7030]: CRED_DISP pid=7030 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:01:36.902000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-139.178.70.43:22-139.178.68.195:43916 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 10:01:37.855200 env[1473]: time="2024-02-13T10:01:37.854961620Z" level=info msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\"" Feb 13 10:01:37.884459 env[1473]: time="2024-02-13T10:01:37.884425184Z" level=error msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\" failed" error="failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:01:37.884729 kubelet[2593]: E0213 10:01:37.884695 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2" Feb 13 10:01:37.884913 kubelet[2593]: E0213 10:01:37.884736 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2} Feb 13 10:01:37.884913 kubelet[2593]: E0213 10:01:37.884761 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:01:37.884913 kubelet[2593]: E0213 10:01:37.884778 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-sv24x" podUID=fe6819ac-25fb-455a-b6b5-7432acf1219d Feb 13 10:01:41.910844 systemd[1]: Started sshd@13-139.178.70.43:22-139.178.68.195:43918.service. Feb 13 10:01:41.909000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-139.178.70.43:22-139.178.68.195:43918 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:01:41.937939 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 10:01:41.937997 kernel: audit: type=1130 audit(1707818501.909:1273): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-139.178.70.43:22-139.178.68.195:43918 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 10:01:42.045000 audit[7087]: USER_ACCT pid=7087 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:01:42.047504 sshd[7087]: Accepted publickey for core from 139.178.68.195 port 43918 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 10:01:42.049578 sshd[7087]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 10:01:42.052041 systemd-logind[1461]: New session 16 of user core. Feb 13 10:01:42.052527 systemd[1]: Started session-16.scope. Feb 13 10:01:42.131945 sshd[7087]: pam_unix(sshd:session): session closed for user core Feb 13 10:01:42.133538 systemd[1]: sshd@13-139.178.70.43:22-139.178.68.195:43918.service: Deactivated successfully. Feb 13 10:01:42.134009 systemd[1]: session-16.scope: Deactivated successfully. Feb 13 10:01:42.134300 systemd-logind[1461]: Session 16 logged out. Waiting for processes to exit. Feb 13 10:01:42.134874 systemd-logind[1461]: Removed session 16. Feb 13 10:01:42.048000 audit[7087]: CRED_ACQ pid=7087 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:01:42.231766 kernel: audit: type=1101 audit(1707818502.045:1274): pid=7087 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:01:42.231805 kernel: audit: type=1103 audit(1707818502.048:1275): pid=7087 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:01:42.231822 kernel: audit: type=1006 audit(1707818502.048:1276): pid=7087 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=16 res=1 Feb 13 10:01:42.290423 kernel: audit: type=1300 audit(1707818502.048:1276): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd300272b0 a2=3 a3=0 items=0 ppid=1 pid=7087 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=16 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:01:42.048000 audit[7087]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd300272b0 a2=3 a3=0 items=0 ppid=1 pid=7087 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=16 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:01:42.382414 kernel: audit: type=1327 audit(1707818502.048:1276): proctitle=737368643A20636F7265205B707269765D Feb 13 10:01:42.048000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 10:01:42.413031 kernel: audit: type=1105 audit(1707818502.053:1277): pid=7087 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:01:42.053000 
audit[7087]: USER_START pid=7087 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:01:42.054000 audit[7089]: CRED_ACQ pid=7089 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:01:42.596776 kernel: audit: type=1103 audit(1707818502.054:1278): pid=7089 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:01:42.596812 kernel: audit: type=1106 audit(1707818502.131:1279): pid=7087 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:01:42.131000 audit[7087]: USER_END pid=7087 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:01:42.131000 audit[7087]: CRED_DISP pid=7087 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:01:42.781643 kernel: audit: type=1104 audit(1707818502.131:1280): pid=7087 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:01:42.132000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-139.178.70.43:22-139.178.68.195:43918 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 10:01:42.854718 env[1473]: time="2024-02-13T10:01:42.854528103Z" level=info msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\""
Feb 13 10:01:42.879477 env[1473]: time="2024-02-13T10:01:42.879415280Z" level=error msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\" failed" error="failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 10:01:42.879628 kubelet[2593]: E0213 10:01:42.879583 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654"
Feb 13 10:01:42.879628 kubelet[2593]: E0213 10:01:42.879609 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654}
Feb 13 10:01:42.879628 kubelet[2593]: E0213 10:01:42.879629 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 13 10:01:42.879873 kubelet[2593]: E0213 10:01:42.879647 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-zxn6w" podUID=ac15c9fc-cc5d-4a8f-ac09-16f6497ee733
Feb 13 10:01:47.145544 systemd[1]: Started sshd@14-139.178.70.43:22-139.178.68.195:58086.service.
Feb 13 10:01:47.144000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-139.178.70.43:22-139.178.68.195:58086 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 10:01:47.186467 kernel: kauditd_printk_skb: 1 callbacks suppressed
Feb 13 10:01:47.186578 kernel: audit: type=1130 audit(1707818507.144:1282): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-139.178.70.43:22-139.178.68.195:58086 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 10:01:47.294000 audit[7142]: USER_ACCT pid=7142 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:01:47.295399 sshd[7142]: Accepted publickey for core from 139.178.68.195 port 58086 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w
Feb 13 10:01:47.296621 sshd[7142]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Feb 13 10:01:47.298782 systemd-logind[1461]: New session 17 of user core.
Feb 13 10:01:47.299273 systemd[1]: Started session-17.scope.
Feb 13 10:01:47.295000 audit[7142]: CRED_ACQ pid=7142 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:01:47.477175 kernel: audit: type=1101 audit(1707818507.294:1283): pid=7142 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:01:47.477211 kernel: audit: type=1103 audit(1707818507.295:1284): pid=7142 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:01:47.477242 kernel: audit: type=1006 audit(1707818507.295:1285): pid=7142 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=17 res=1
Feb 13 10:01:47.535788 kernel: audit: type=1300 audit(1707818507.295:1285): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc0b6178b0 a2=3 a3=0 items=0 ppid=1 pid=7142 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=17 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 10:01:47.295000 audit[7142]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc0b6178b0 a2=3 a3=0 items=0 ppid=1 pid=7142 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=17 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 10:01:47.295000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D
Feb 13 10:01:47.628101 sshd[7142]: pam_unix(sshd:session): session closed for user core
Feb 13 10:01:47.629540 systemd[1]: sshd@14-139.178.70.43:22-139.178.68.195:58086.service: Deactivated successfully.
Feb 13 10:01:47.629951 systemd[1]: session-17.scope: Deactivated successfully.
Feb 13 10:01:47.630281 systemd-logind[1461]: Session 17 logged out. Waiting for processes to exit.
Feb 13 10:01:47.630829 systemd-logind[1461]: Removed session 17.
Feb 13 10:01:47.658421 kernel: audit: type=1327 audit(1707818507.295:1285): proctitle=737368643A20636F7265205B707269765D
Feb 13 10:01:47.658460 kernel: audit: type=1105 audit(1707818507.300:1286): pid=7142 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:01:47.300000 audit[7142]: USER_START pid=7142 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:01:47.753086 kernel: audit: type=1103 audit(1707818507.300:1287): pid=7144 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:01:47.300000 audit[7144]: CRED_ACQ pid=7144 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:01:47.842352 kernel: audit: type=1106 audit(1707818507.627:1288): pid=7142 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:01:47.627000 audit[7142]: USER_END pid=7142 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:01:47.853887 env[1473]: time="2024-02-13T10:01:47.853843519Z" level=info msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\""
Feb 13 10:01:47.865598 env[1473]: time="2024-02-13T10:01:47.865536485Z" level=error msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\" failed" error="failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 10:01:47.865712 kubelet[2593]: E0213 10:01:47.865700 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f"
Feb 13 10:01:47.865894 kubelet[2593]: E0213 10:01:47.865729 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f}
Feb 13 10:01:47.865894 kubelet[2593]: E0213 10:01:47.865752 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 13 10:01:47.865894 kubelet[2593]: E0213 10:01:47.865771 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250
Feb 13 10:01:47.937925 kernel: audit: type=1104 audit(1707818507.627:1289): pid=7142 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:01:47.627000 audit[7142]: CRED_DISP pid=7142 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:01:47.628000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-139.178.70.43:22-139.178.68.195:58086 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 10:01:48.853548 env[1473]: time="2024-02-13T10:01:48.853507621Z" level=info msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\""
Feb 13 10:01:48.868579 env[1473]: time="2024-02-13T10:01:48.868515741Z" level=error msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\" failed" error="failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 10:01:48.868814 kubelet[2593]: E0213 10:01:48.868699 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768"
Feb 13 10:01:48.868814 kubelet[2593]: E0213 10:01:48.868728 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768}
Feb 13 10:01:48.868814 kubelet[2593]: E0213 10:01:48.868752 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 13 10:01:48.868814 kubelet[2593]: E0213 10:01:48.868770 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-86cd8c4979-2tlsw" podUID=18384425-4aba-475c-a64f-6bfe3101b275
Feb 13 10:01:52.582618 systemd[1]: Started sshd@15-139.178.70.43:22-139.178.68.195:58092.service.
Feb 13 10:01:52.581000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-139.178.70.43:22-139.178.68.195:58092 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 10:01:52.618923 kernel: kauditd_printk_skb: 1 callbacks suppressed
Feb 13 10:01:52.619027 kernel: audit: type=1130 audit(1707818512.581:1291): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-139.178.70.43:22-139.178.68.195:58092 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 10:01:52.726000 audit[7228]: USER_ACCT pid=7228 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:01:52.727861 sshd[7228]: Accepted publickey for core from 139.178.68.195 port 58092 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w
Feb 13 10:01:52.729044 sshd[7228]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Feb 13 10:01:52.731362 systemd-logind[1461]: New session 18 of user core.
Feb 13 10:01:52.731876 systemd[1]: Started session-18.scope.
Feb 13 10:01:52.809376 sshd[7228]: pam_unix(sshd:session): session closed for user core
Feb 13 10:01:52.810834 systemd[1]: sshd@15-139.178.70.43:22-139.178.68.195:58092.service: Deactivated successfully.
Feb 13 10:01:52.811249 systemd[1]: session-18.scope: Deactivated successfully.
Feb 13 10:01:52.811565 systemd-logind[1461]: Session 18 logged out. Waiting for processes to exit.
Feb 13 10:01:52.811975 systemd-logind[1461]: Removed session 18.
Feb 13 10:01:52.727000 audit[7228]: CRED_ACQ pid=7228 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:01:52.854137 env[1473]: time="2024-02-13T10:01:52.854088446Z" level=info msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\""
Feb 13 10:01:52.865708 env[1473]: time="2024-02-13T10:01:52.865646715Z" level=error msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\" failed" error="failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 10:01:52.865825 kubelet[2593]: E0213 10:01:52.865809 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2"
Feb 13 10:01:52.865997 kubelet[2593]: E0213 10:01:52.865844 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2}
Feb 13 10:01:52.865997 kubelet[2593]: E0213 10:01:52.865872 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 13 10:01:52.865997 kubelet[2593]: E0213 10:01:52.865892 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-sv24x" podUID=fe6819ac-25fb-455a-b6b5-7432acf1219d
Feb 13 10:01:52.909606 kernel: audit: type=1101 audit(1707818512.726:1292): pid=7228 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:01:52.909646 kernel: audit: type=1103 audit(1707818512.727:1293): pid=7228 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:01:52.909661 kernel: audit: type=1006 audit(1707818512.727:1294): pid=7228 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=18 res=1
Feb 13 10:01:52.727000 audit[7228]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd119ae740 a2=3 a3=0 items=0 ppid=1 pid=7228 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=18 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 10:01:53.060256 kernel: audit: type=1300 audit(1707818512.727:1294): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd119ae740 a2=3 a3=0 items=0 ppid=1 pid=7228 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=18 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 10:01:53.060338 kernel: audit: type=1327 audit(1707818512.727:1294): proctitle=737368643A20636F7265205B707269765D
Feb 13 10:01:52.727000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D
Feb 13 10:01:53.090846 kernel: audit: type=1105 audit(1707818512.732:1295): pid=7228 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:01:52.732000 audit[7228]: USER_START pid=7228 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:01:53.185417 kernel: audit: type=1103 audit(1707818512.732:1296): pid=7230 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:01:52.732000 audit[7230]: CRED_ACQ pid=7230 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:01:53.274715 kernel: audit: type=1106 audit(1707818512.808:1297): pid=7228 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:01:52.808000 audit[7228]: USER_END pid=7228 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:01:53.370317 kernel: audit: type=1104 audit(1707818512.808:1298): pid=7228 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:01:52.808000 audit[7228]: CRED_DISP pid=7228 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:01:52.809000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-139.178.70.43:22-139.178.68.195:58092 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 10:01:54.854886 env[1473]: time="2024-02-13T10:01:54.854765936Z" level=info msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\""
Feb 13 10:01:54.907901 env[1473]: time="2024-02-13T10:01:54.907822354Z" level=error msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\" failed" error="failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 10:01:54.908173 kubelet[2593]: E0213 10:01:54.908128 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654"
Feb 13 10:01:54.908537 kubelet[2593]: E0213 10:01:54.908177 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654}
Feb 13 10:01:54.908537 kubelet[2593]: E0213 10:01:54.908229 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 13 10:01:54.908537 kubelet[2593]: E0213 10:01:54.908271 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-zxn6w" podUID=ac15c9fc-cc5d-4a8f-ac09-16f6497ee733
Feb 13 10:01:55.206000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 10:01:55.206000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c00103b120 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null)
Feb 13 10:01:55.206000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269
Feb 13 10:01:55.206000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 10:01:55.206000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c0024bb710 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null)
Feb 13 10:01:55.206000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269
Feb 13 10:01:55.436000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 10:01:55.436000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=68 a1=c0002c38c0 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null)
Feb 13 10:01:55.436000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75
Feb 13 10:01:55.436000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 10:01:55.436000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=68 a1=c015ddc8a0 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null)
Feb 13 10:01:55.436000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75
Feb 13 10:01:55.437000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 10:01:55.437000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=68 a1=c01128ae40 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null)
Feb 13 10:01:55.437000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75
Feb 13 10:01:55.438000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/apiserver.crt" dev="sdb9" ino=525073 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 10:01:55.438000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=68 a1=c010a5da70 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null)
Feb 13 10:01:55.438000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75
Feb 13 10:01:55.439000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 10:01:55.439000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=68 a1=c002c12160 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null)
Feb 13 10:01:55.439000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75
Feb 13 10:01:55.439000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-client.crt" dev="sdb9" ino=525079 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 10:01:55.439000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=69 a1=c00b08a750 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null)
Feb 13 10:01:55.439000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75
Feb 13 10:01:57.818621 systemd[1]: Started sshd@16-139.178.70.43:22-139.178.68.195:58904.service.
Feb 13 10:01:57.817000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-139.178.70.43:22-139.178.68.195:58904 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 10:01:57.845806 kernel: kauditd_printk_skb: 25 callbacks suppressed
Feb 13 10:01:57.845914 kernel: audit: type=1130 audit(1707818517.817:1308): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-139.178.70.43:22-139.178.68.195:58904 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 10:01:57.954000 audit[7314]: USER_ACCT pid=7314 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:01:57.955107 sshd[7314]: Accepted publickey for core from 139.178.68.195 port 58904 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w
Feb 13 10:01:57.963108 sshd[7314]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Feb 13 10:01:57.965366 systemd-logind[1461]: New session 19 of user core.
Feb 13 10:01:57.965817 systemd[1]: Started session-19.scope.
Feb 13 10:01:58.044155 sshd[7314]: pam_unix(sshd:session): session closed for user core
Feb 13 10:01:58.045609 systemd[1]: sshd@16-139.178.70.43:22-139.178.68.195:58904.service: Deactivated successfully.
Feb 13 10:01:58.046033 systemd[1]: session-19.scope: Deactivated successfully.
Feb 13 10:01:58.046344 systemd-logind[1461]: Session 19 logged out. Waiting for processes to exit.
Feb 13 10:01:57.955000 audit[7314]: CRED_ACQ pid=7314 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:01:58.047113 systemd-logind[1461]: Removed session 19.
Feb 13 10:01:58.137414 kernel: audit: type=1101 audit(1707818517.954:1309): pid=7314 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:01:58.137450 kernel: audit: type=1103 audit(1707818517.955:1310): pid=7314 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:01:58.137468 kernel: audit: type=1006 audit(1707818517.955:1311): pid=7314 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=19 res=1
Feb 13 10:01:58.196048 kernel: audit: type=1300 audit(1707818517.955:1311): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc2f85dbe0 a2=3 a3=0 items=0 ppid=1 pid=7314 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=19 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 10:01:57.955000 audit[7314]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc2f85dbe0 a2=3 a3=0 items=0 ppid=1 pid=7314 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=19 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 10:01:58.288108 kernel: audit: type=1327 audit(1707818517.955:1311): proctitle=737368643A20636F7265205B707269765D
Feb 13 10:01:57.955000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D
Feb 13 10:01:58.318642 kernel: audit: type=1105 audit(1707818517.966:1312): pid=7314 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:01:57.966000 audit[7314]: USER_START pid=7314 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:01:58.413207 kernel: audit: type=1103 audit(1707818517.967:1313): pid=7316 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:01:57.967000 audit[7316]: CRED_ACQ pid=7316 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:01:58.043000 audit[7314]: USER_END pid=7314 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:01:58.598136 kernel: audit: type=1106 audit(1707818518.043:1314): pid=7314 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:01:58.598165 kernel: audit: type=1104 audit(1707818518.043:1315): pid=7314 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:01:58.043000 audit[7314]: CRED_DISP pid=7314 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:01:58.044000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-139.178.70.43:22-139.178.68.195:58904 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 10:02:02.854981 env[1473]: time="2024-02-13T10:02:02.854839796Z" level=info msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\""
Feb 13 10:02:02.883398 env[1473]: time="2024-02-13T10:02:02.883318284Z" level=error msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\" failed" error="failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 10:02:02.883548 kubelet[2593]: E0213 10:02:02.883535 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f"
Feb 13 10:02:02.883710 kubelet[2593]: E0213 10:02:02.883564 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f}
Feb 13 10:02:02.883710 kubelet[2593]: E0213 10:02:02.883585 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 13 10:02:02.883710 kubelet[2593]: E0213 10:02:02.883600 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250
Feb 13 10:02:03.053340 systemd[1]: Started sshd@17-139.178.70.43:22-139.178.68.195:58912.service.
Feb 13 10:02:03.052000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-139.178.70.43:22-139.178.68.195:58912 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 10:02:03.080395 kernel: kauditd_printk_skb: 1 callbacks suppressed
Feb 13 10:02:03.080500 kernel: audit: type=1130 audit(1707818523.052:1317): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-139.178.70.43:22-139.178.68.195:58912 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 10:02:03.195000 audit[7372]: USER_ACCT pid=7372 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:02:03.196525 sshd[7372]: Accepted publickey for core from 139.178.68.195 port 58912 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w
Feb 13 10:02:03.197648 sshd[7372]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Feb 13 10:02:03.199999 systemd-logind[1461]: New session 20 of user core.
Feb 13 10:02:03.200503 systemd[1]: Started session-20.scope.
Feb 13 10:02:03.196000 audit[7372]: CRED_ACQ pid=7372 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:02:03.378427 kernel: audit: type=1101 audit(1707818523.195:1318): pid=7372 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:02:03.378460 kernel: audit: type=1103 audit(1707818523.196:1319): pid=7372 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:02:03.378475 kernel: audit: type=1006 audit(1707818523.196:1320): pid=7372 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=20 res=1
Feb 13 10:02:03.437093 kernel: audit: type=1300 audit(1707818523.196:1320): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe0e3d3b90 a2=3 a3=0 items=0 ppid=1 pid=7372 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=20 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 10:02:03.196000 audit[7372]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe0e3d3b90 a2=3 a3=0 items=0 ppid=1 pid=7372 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=20 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 10:02:03.475137 systemd[1]: Started sshd@18-139.178.70.43:22-139.178.68.195:58916.service.
Feb 13 10:02:03.196000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D
Feb 13 10:02:03.529362 sshd[7372]: pam_unix(sshd:session): session closed for user core
Feb 13 10:02:03.530662 systemd[1]: sshd@17-139.178.70.43:22-139.178.68.195:58912.service: Deactivated successfully.
Feb 13 10:02:03.531048 systemd[1]: session-20.scope: Deactivated successfully.
Feb 13 10:02:03.531363 systemd-logind[1461]: Session 20 logged out. Waiting for processes to exit.
Feb 13 10:02:03.531985 systemd-logind[1461]: Removed session 20.
Feb 13 10:02:03.559662 kernel: audit: type=1327 audit(1707818523.196:1320): proctitle=737368643A20636F7265205B707269765D
Feb 13 10:02:03.559695 kernel: audit: type=1105 audit(1707818523.201:1321): pid=7372 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:02:03.201000 audit[7372]: USER_START pid=7372 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:02:03.580234 sshd[7397]: Accepted publickey for core from 139.178.68.195 port 58916 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w
Feb 13 10:02:03.583656 sshd[7397]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Feb 13 10:02:03.585921 systemd-logind[1461]: New session 21 of user core.
Feb 13 10:02:03.586430 systemd[1]: Started session-21.scope.
Feb 13 10:02:03.202000 audit[7374]: CRED_ACQ pid=7374 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:02:03.743609 kernel: audit: type=1103 audit(1707818523.202:1322): pid=7374 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:02:03.743700 kernel: audit: type=1130 audit(1707818523.474:1323): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-139.178.70.43:22-139.178.68.195:58916 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 10:02:03.474000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-139.178.70.43:22-139.178.68.195:58916 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 10:02:03.832417 kernel: audit: type=1106 audit(1707818523.528:1324): pid=7372 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:02:03.528000 audit[7372]: USER_END pid=7372 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:02:03.858300 env[1473]: time="2024-02-13T10:02:03.855372353Z" level=info msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\""
Feb 13 10:02:03.870261 env[1473]: time="2024-02-13T10:02:03.870201706Z" level=error msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\" failed" error="failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 10:02:03.870392 kubelet[2593]: E0213 10:02:03.870369 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768"
Feb 13 10:02:03.870431 kubelet[2593]: E0213 10:02:03.870400 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768}
Feb 13 10:02:03.870431 kubelet[2593]: E0213 10:02:03.870425 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 13 10:02:03.870495 kubelet[2593]: E0213 10:02:03.870443 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-86cd8c4979-2tlsw" podUID=18384425-4aba-475c-a64f-6bfe3101b275
Feb 13 10:02:03.528000 audit[7372]: CRED_DISP pid=7372 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:02:03.530000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-139.178.70.43:22-139.178.68.195:58912 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 10:02:03.579000 audit[7397]: USER_ACCT pid=7397 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:02:03.583000 audit[7397]: CRED_ACQ pid=7397 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:02:03.583000 audit[7397]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe3d8f8440 a2=3 a3=0 items=0 ppid=1 pid=7397 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=21 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 10:02:03.583000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D
Feb 13 10:02:03.587000 audit[7397]: USER_START pid=7397 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:02:03.588000 audit[7400]: CRED_ACQ pid=7400 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:02:04.138260 sshd[7397]: pam_unix(sshd:session): session closed for user core
Feb 13 10:02:04.137000 audit[7397]: USER_END pid=7397 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:02:04.138000 audit[7397]: CRED_DISP pid=7397 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:02:04.140970 systemd[1]: sshd@18-139.178.70.43:22-139.178.68.195:58916.service: Deactivated successfully.
Feb 13 10:02:04.139000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-139.178.70.43:22-139.178.68.195:58916 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 10:02:04.141422 systemd[1]: session-21.scope: Deactivated successfully.
Feb 13 10:02:04.141937 systemd-logind[1461]: Session 21 logged out. Waiting for processes to exit.
Feb 13 10:02:04.142657 systemd[1]: Started sshd@19-139.178.70.43:22-139.178.68.195:58926.service.
Feb 13 10:02:04.141000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-139.178.70.43:22-139.178.68.195:58926 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 10:02:04.143164 systemd-logind[1461]: Removed session 21.
Feb 13 10:02:04.169000 audit[7452]: USER_ACCT pid=7452 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:02:04.169849 sshd[7452]: Accepted publickey for core from 139.178.68.195 port 58926 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w
Feb 13 10:02:04.169000 audit[7452]: CRED_ACQ pid=7452 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:02:04.169000 audit[7452]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffc8c6d110 a2=3 a3=0 items=0 ppid=1 pid=7452 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=22 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 10:02:04.169000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D
Feb 13 10:02:04.170571 sshd[7452]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Feb 13 10:02:04.173004 systemd-logind[1461]: New session 22 of user core.
Feb 13 10:02:04.173499 systemd[1]: Started session-22.scope.
Feb 13 10:02:04.174000 audit[7452]: USER_START pid=7452 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:02:04.174000 audit[7454]: CRED_ACQ pid=7454 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:02:04.284908 sshd[7452]: pam_unix(sshd:session): session closed for user core
Feb 13 10:02:04.284000 audit[7452]: USER_END pid=7452 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:02:04.284000 audit[7452]: CRED_DISP pid=7452 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:02:04.286666 systemd[1]: sshd@19-139.178.70.43:22-139.178.68.195:58926.service: Deactivated successfully.
Feb 13 10:02:04.286000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-139.178.70.43:22-139.178.68.195:58926 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 10:02:04.287175 systemd[1]: session-22.scope: Deactivated successfully.
Feb 13 10:02:04.287675 systemd-logind[1461]: Session 22 logged out. Waiting for processes to exit. Feb 13 10:02:04.288260 systemd-logind[1461]: Removed session 22. Feb 13 10:02:05.854855 env[1473]: time="2024-02-13T10:02:05.854714326Z" level=info msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\"" Feb 13 10:02:05.879580 env[1473]: time="2024-02-13T10:02:05.879517343Z" level=error msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\" failed" error="failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:02:05.879737 kubelet[2593]: E0213 10:02:05.879716 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2" Feb 13 10:02:05.879897 kubelet[2593]: E0213 10:02:05.879743 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2} Feb 13 10:02:05.879897 kubelet[2593]: E0213 10:02:05.879766 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:02:05.879897 kubelet[2593]: E0213 10:02:05.879783 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-sv24x" podUID=fe6819ac-25fb-455a-b6b5-7432acf1219d Feb 13 10:02:06.854241 env[1473]: time="2024-02-13T10:02:06.854130406Z" level=info msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\"" Feb 13 10:02:06.881257 env[1473]: time="2024-02-13T10:02:06.881199456Z" level=error msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\" failed" error="failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:02:06.881561 kubelet[2593]: E0213 10:02:06.881431 2593 remote_runtime.go:205] "StopPodSandbox from runtime 
service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654" Feb 13 10:02:06.881561 kubelet[2593]: E0213 10:02:06.881463 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654} Feb 13 10:02:06.881561 kubelet[2593]: E0213 10:02:06.881493 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:02:06.881561 kubelet[2593]: E0213 10:02:06.881518 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-zxn6w" podUID=ac15c9fc-cc5d-4a8f-ac09-16f6497ee733 Feb 13 10:02:09.294942 systemd[1]: Started sshd@20-139.178.70.43:22-139.178.68.195:49456.service. Feb 13 10:02:09.293000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-139.178.70.43:22-139.178.68.195:49456 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:02:09.322271 kernel: kauditd_printk_skb: 23 callbacks suppressed Feb 13 10:02:09.322310 kernel: audit: type=1130 audit(1707818529.293:1344): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-139.178.70.43:22-139.178.68.195:49456 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:02:09.432000 audit[7536]: USER_ACCT pid=7536 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:09.432841 sshd[7536]: Accepted publickey for core from 139.178.68.195 port 49456 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 10:02:09.433630 sshd[7536]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 10:02:09.435863 systemd-logind[1461]: New session 23 of user core. Feb 13 10:02:09.436301 systemd[1]: Started session-23.scope. Feb 13 10:02:09.514761 sshd[7536]: pam_unix(sshd:session): session closed for user core Feb 13 10:02:09.516158 systemd[1]: sshd@20-139.178.70.43:22-139.178.68.195:49456.service: Deactivated successfully. 
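Note: the StopPodSandbox failures above (and the ones repeated below) all reduce to the same root cause reported by the CNI plugin: /var/lib/calico/nodename does not exist on the host, which typically means the calico/node container is not running or does not have /var/lib/calico/ mounted, so the kubelet keeps retrying the sandbox teardown and re-logging the error for each affected pod. A minimal sketch of the file check the plugin is failing on, assuming it is run on the affected node:

    import os

    # The Calico CNI plugin stats this file to learn the node name; calico/node
    # writes it once it is up and has /var/lib/calico/ mounted from the host.
    nodename = "/var/lib/calico/nodename"
    print(nodename, "exists:", os.path.exists(nodename))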
Feb 13 10:02:09.516573 systemd[1]: session-23.scope: Deactivated successfully. Feb 13 10:02:09.516894 systemd-logind[1461]: Session 23 logged out. Waiting for processes to exit. Feb 13 10:02:09.517242 systemd-logind[1461]: Removed session 23. Feb 13 10:02:09.432000 audit[7536]: CRED_ACQ pid=7536 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:09.615590 kernel: audit: type=1101 audit(1707818529.432:1345): pid=7536 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:09.615632 kernel: audit: type=1103 audit(1707818529.432:1346): pid=7536 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:09.615654 kernel: audit: type=1006 audit(1707818529.432:1347): pid=7536 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=23 res=1 Feb 13 10:02:09.674564 kernel: audit: type=1300 audit(1707818529.432:1347): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc89fe34b0 a2=3 a3=0 items=0 ppid=1 pid=7536 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=23 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:02:09.432000 audit[7536]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc89fe34b0 a2=3 a3=0 items=0 ppid=1 pid=7536 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=23 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:02:09.767076 kernel: audit: type=1327 audit(1707818529.432:1347): proctitle=737368643A20636F7265205B707269765D Feb 13 10:02:09.432000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 10:02:09.797759 kernel: audit: type=1105 audit(1707818529.437:1348): pid=7536 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:09.437000 audit[7536]: USER_START pid=7536 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:09.892775 kernel: audit: type=1103 audit(1707818529.437:1349): pid=7538 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:09.437000 audit[7538]: CRED_ACQ pid=7538 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:09.982579 kernel: 
audit: type=1106 audit(1707818529.514:1350): pid=7536 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:09.514000 audit[7536]: USER_END pid=7536 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:10.078619 kernel: audit: type=1104 audit(1707818529.514:1351): pid=7536 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:09.514000 audit[7536]: CRED_DISP pid=7536 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:09.514000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-139.178.70.43:22-139.178.68.195:49456 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:02:09.614000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:02:09.614000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c0006f5260 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 10:02:09.614000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 10:02:09.616000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:02:09.616000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c0012101a0 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 10:02:09.616000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 10:02:09.618000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" 
path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:02:09.618000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c0012c4ca0 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 10:02:09.618000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 10:02:09.618000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:02:09.618000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=c a1=c0012c4cc0 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 10:02:09.618000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 10:02:14.524793 systemd[1]: Started sshd@21-139.178.70.43:22-139.178.68.195:49472.service. Feb 13 10:02:14.524000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-139.178.70.43:22-139.178.68.195:49472 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:02:14.552110 kernel: kauditd_printk_skb: 13 callbacks suppressed Feb 13 10:02:14.552189 kernel: audit: type=1130 audit(1707818534.524:1357): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-139.178.70.43:22-139.178.68.195:49472 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:02:14.661258 sshd[7563]: Accepted publickey for core from 139.178.68.195 port 49472 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 10:02:14.660000 audit[7563]: USER_ACCT pid=7563 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:14.662778 sshd[7563]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 10:02:14.665040 systemd-logind[1461]: New session 24 of user core. Feb 13 10:02:14.665549 systemd[1]: Started session-24.scope. Feb 13 10:02:14.744821 sshd[7563]: pam_unix(sshd:session): session closed for user core Feb 13 10:02:14.746302 systemd[1]: sshd@21-139.178.70.43:22-139.178.68.195:49472.service: Deactivated successfully. Feb 13 10:02:14.746721 systemd[1]: session-24.scope: Deactivated successfully. 
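Note: the AVC records above show kube-controller-manager (domain svirt_lxc_net_t) being denied inotify watches on /etc/kubernetes/pki/ca.crt (type etc_t) with permissive=0, i.e. the denial is enforced, which is why the same record recurs as the watch is retried. Their PROCTITLE field is again NUL-delimited hex, cut off by the audit subsystem's proctitle length limit. A minimal Python sketch of the decode (the trailing element is truncated in the record itself and is left that way here):

    hexstr = ("6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F"
              "636174652D6E6F64652D63696472733D74727565002D2D61757468656E74"
              "69636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E"
              "657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D"
              "2D617574686F7269")
    for arg in bytes.fromhex(hexstr).split(b"\x00"):
        print(arg.decode())
    # -> kube-controller-manager
    #    --allocate-node-cidrs=true
    #    --authentication-kubeconfig=/etc/kubernetes/controller-manager.conf
    #    --authori        (truncated in the audit record)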
Feb 13 10:02:14.747105 systemd-logind[1461]: Session 24 logged out. Waiting for processes to exit. Feb 13 10:02:14.747579 systemd-logind[1461]: Removed session 24. Feb 13 10:02:14.661000 audit[7563]: CRED_ACQ pid=7563 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:14.844127 kernel: audit: type=1101 audit(1707818534.660:1358): pid=7563 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:14.844172 kernel: audit: type=1103 audit(1707818534.661:1359): pid=7563 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:14.844220 kernel: audit: type=1006 audit(1707818534.661:1360): pid=7563 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=24 res=1 Feb 13 10:02:14.661000 audit[7563]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffe4f83d30 a2=3 a3=0 items=0 ppid=1 pid=7563 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=24 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:02:14.995580 kernel: audit: type=1300 audit(1707818534.661:1360): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffe4f83d30 a2=3 a3=0 items=0 ppid=1 pid=7563 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=24 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:02:14.995624 kernel: audit: type=1327 audit(1707818534.661:1360): proctitle=737368643A20636F7265205B707269765D Feb 13 10:02:14.661000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 10:02:14.666000 audit[7563]: USER_START pid=7563 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:15.121388 kernel: audit: type=1105 audit(1707818534.666:1361): pid=7563 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:15.121437 kernel: audit: type=1103 audit(1707818534.667:1362): pid=7566 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:14.667000 audit[7566]: CRED_ACQ pid=7566 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:14.744000 audit[7563]: USER_END pid=7563 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 
msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:15.306201 kernel: audit: type=1106 audit(1707818534.744:1363): pid=7563 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:15.306255 kernel: audit: type=1104 audit(1707818534.744:1364): pid=7563 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:14.744000 audit[7563]: CRED_DISP pid=7563 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:14.745000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-139.178.70.43:22-139.178.68.195:49472 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:02:16.854931 env[1473]: time="2024-02-13T10:02:16.854805753Z" level=info msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\"" Feb 13 10:02:16.883121 env[1473]: time="2024-02-13T10:02:16.883059365Z" level=error msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\" failed" error="failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:02:16.883264 kubelet[2593]: E0213 10:02:16.883254 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768" Feb 13 10:02:16.883473 kubelet[2593]: E0213 10:02:16.883280 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768} Feb 13 10:02:16.883473 kubelet[2593]: E0213 10:02:16.883303 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:02:16.883473 kubelet[2593]: E0213 10:02:16.883321 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to 
\"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-86cd8c4979-2tlsw" podUID=18384425-4aba-475c-a64f-6bfe3101b275 Feb 13 10:02:17.854492 env[1473]: time="2024-02-13T10:02:17.854335335Z" level=info msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\"" Feb 13 10:02:17.881187 env[1473]: time="2024-02-13T10:02:17.881135910Z" level=error msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\" failed" error="failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:02:17.881439 kubelet[2593]: E0213 10:02:17.881246 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f" Feb 13 10:02:17.881439 kubelet[2593]: E0213 10:02:17.881269 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f} Feb 13 10:02:17.881439 kubelet[2593]: E0213 10:02:17.881289 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:02:17.881439 kubelet[2593]: E0213 10:02:17.881307 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250 Feb 13 10:02:19.754219 systemd[1]: Started sshd@22-139.178.70.43:22-139.178.68.195:52658.service. Feb 13 10:02:19.753000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-139.178.70.43:22-139.178.68.195:52658 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 10:02:19.780738 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 10:02:19.780815 kernel: audit: type=1130 audit(1707818539.753:1366): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-139.178.70.43:22-139.178.68.195:52658 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:02:19.853352 env[1473]: time="2024-02-13T10:02:19.853320959Z" level=info msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\"" Feb 13 10:02:19.865794 env[1473]: time="2024-02-13T10:02:19.865754519Z" level=error msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\" failed" error="failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:02:19.866001 kubelet[2593]: E0213 10:02:19.865987 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654" Feb 13 10:02:19.866413 kubelet[2593]: E0213 10:02:19.866016 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654} Feb 13 10:02:19.866413 kubelet[2593]: E0213 10:02:19.866042 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:02:19.866413 kubelet[2593]: E0213 10:02:19.866059 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-zxn6w" podUID=ac15c9fc-cc5d-4a8f-ac09-16f6497ee733 Feb 13 10:02:19.890000 audit[7644]: USER_ACCT pid=7644 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:19.891486 sshd[7644]: Accepted publickey for core from 139.178.68.195 port 52658 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 10:02:19.893355 sshd[7644]: pam_unix(sshd:session): session opened for user 
core(uid=500) by (uid=0) Feb 13 10:02:19.895701 systemd-logind[1461]: New session 25 of user core. Feb 13 10:02:19.896149 systemd[1]: Started session-25.scope. Feb 13 10:02:19.974820 sshd[7644]: pam_unix(sshd:session): session closed for user core Feb 13 10:02:19.976205 systemd[1]: sshd@22-139.178.70.43:22-139.178.68.195:52658.service: Deactivated successfully. Feb 13 10:02:19.976631 systemd[1]: session-25.scope: Deactivated successfully. Feb 13 10:02:19.976937 systemd-logind[1461]: Session 25 logged out. Waiting for processes to exit. Feb 13 10:02:19.977319 systemd-logind[1461]: Removed session 25. Feb 13 10:02:19.891000 audit[7644]: CRED_ACQ pid=7644 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:20.073261 kernel: audit: type=1101 audit(1707818539.890:1367): pid=7644 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:20.073321 kernel: audit: type=1103 audit(1707818539.891:1368): pid=7644 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:20.073370 kernel: audit: type=1006 audit(1707818539.891:1369): pid=7644 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=25 res=1 Feb 13 10:02:19.891000 audit[7644]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffdc84d9df0 a2=3 a3=0 items=0 ppid=1 pid=7644 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=25 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:02:20.223867 kernel: audit: type=1300 audit(1707818539.891:1369): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffdc84d9df0 a2=3 a3=0 items=0 ppid=1 pid=7644 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=25 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:02:20.223966 kernel: audit: type=1327 audit(1707818539.891:1369): proctitle=737368643A20636F7265205B707269765D Feb 13 10:02:19.891000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 10:02:19.897000 audit[7644]: USER_START pid=7644 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:20.348846 kernel: audit: type=1105 audit(1707818539.897:1370): pid=7644 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:20.348884 kernel: audit: type=1103 audit(1707818539.898:1371): pid=7676 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" 
hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:19.898000 audit[7676]: CRED_ACQ pid=7676 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:19.974000 audit[7644]: USER_END pid=7644 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:20.533480 kernel: audit: type=1106 audit(1707818539.974:1372): pid=7644 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:20.533526 kernel: audit: type=1104 audit(1707818539.974:1373): pid=7644 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:19.974000 audit[7644]: CRED_DISP pid=7644 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:19.974000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-139.178.70.43:22-139.178.68.195:52658 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 10:02:20.854742 env[1473]: time="2024-02-13T10:02:20.854515639Z" level=info msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\"" Feb 13 10:02:20.882780 env[1473]: time="2024-02-13T10:02:20.882747422Z" level=error msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\" failed" error="failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:02:20.882961 kubelet[2593]: E0213 10:02:20.882923 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2" Feb 13 10:02:20.882961 kubelet[2593]: E0213 10:02:20.882948 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2} Feb 13 10:02:20.883162 kubelet[2593]: E0213 10:02:20.882970 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:02:20.883162 kubelet[2593]: E0213 10:02:20.882988 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-sv24x" podUID=fe6819ac-25fb-455a-b6b5-7432acf1219d Feb 13 10:02:24.986074 systemd[1]: Started sshd@23-139.178.70.43:22-139.178.68.195:52666.service. Feb 13 10:02:24.986000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-139.178.70.43:22-139.178.68.195:52666 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:02:25.027850 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 10:02:25.027962 kernel: audit: type=1130 audit(1707818544.986:1375): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-139.178.70.43:22-139.178.68.195:52666 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 10:02:25.136000 audit[7728]: USER_ACCT pid=7728 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:25.136745 sshd[7728]: Accepted publickey for core from 139.178.68.195 port 52666 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 10:02:25.137637 sshd[7728]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 10:02:25.139939 systemd-logind[1461]: New session 26 of user core. Feb 13 10:02:25.140495 systemd[1]: Started session-26.scope. Feb 13 10:02:25.219334 sshd[7728]: pam_unix(sshd:session): session closed for user core Feb 13 10:02:25.220822 systemd[1]: sshd@23-139.178.70.43:22-139.178.68.195:52666.service: Deactivated successfully. Feb 13 10:02:25.221247 systemd[1]: session-26.scope: Deactivated successfully. Feb 13 10:02:25.221607 systemd-logind[1461]: Session 26 logged out. Waiting for processes to exit. Feb 13 10:02:25.222032 systemd-logind[1461]: Removed session 26. Feb 13 10:02:25.137000 audit[7728]: CRED_ACQ pid=7728 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:25.319175 kernel: audit: type=1101 audit(1707818545.136:1376): pid=7728 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:25.319212 kernel: audit: type=1103 audit(1707818545.137:1377): pid=7728 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:25.319228 kernel: audit: type=1006 audit(1707818545.137:1378): pid=7728 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=26 res=1 Feb 13 10:02:25.377744 kernel: audit: type=1300 audit(1707818545.137:1378): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff5e555160 a2=3 a3=0 items=0 ppid=1 pid=7728 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=26 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:02:25.137000 audit[7728]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff5e555160 a2=3 a3=0 items=0 ppid=1 pid=7728 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=26 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:02:25.469753 kernel: audit: type=1327 audit(1707818545.137:1378): proctitle=737368643A20636F7265205B707269765D Feb 13 10:02:25.137000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 10:02:25.500272 kernel: audit: type=1105 audit(1707818545.142:1379): pid=7728 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:25.142000 
audit[7728]: USER_START pid=7728 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:25.594786 kernel: audit: type=1103 audit(1707818545.142:1380): pid=7730 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:25.142000 audit[7730]: CRED_ACQ pid=7730 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:25.684032 kernel: audit: type=1106 audit(1707818545.219:1381): pid=7728 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:25.219000 audit[7728]: USER_END pid=7728 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:25.779573 kernel: audit: type=1104 audit(1707818545.219:1382): pid=7728 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:25.219000 audit[7728]: CRED_DISP pid=7728 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:25.220000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-139.178.70.43:22-139.178.68.195:52666 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 10:02:27.854778 env[1473]: time="2024-02-13T10:02:27.854666786Z" level=info msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\"" Feb 13 10:02:27.905828 env[1473]: time="2024-02-13T10:02:27.905725980Z" level=error msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\" failed" error="failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:02:27.906063 kubelet[2593]: E0213 10:02:27.906033 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768" Feb 13 10:02:27.906557 kubelet[2593]: E0213 10:02:27.906090 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768} Feb 13 10:02:27.906557 kubelet[2593]: E0213 10:02:27.906173 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:02:27.906557 kubelet[2593]: E0213 10:02:27.906239 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-86cd8c4979-2tlsw" podUID=18384425-4aba-475c-a64f-6bfe3101b275 Feb 13 10:02:30.230131 systemd[1]: Started sshd@24-139.178.70.43:22-139.178.68.195:46140.service. Feb 13 10:02:30.230000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@24-139.178.70.43:22-139.178.68.195:46140 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:02:30.257574 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 10:02:30.257663 kernel: audit: type=1130 audit(1707818550.230:1384): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@24-139.178.70.43:22-139.178.68.195:46140 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 10:02:30.381000 audit[7781]: USER_ACCT pid=7781 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:30.382473 sshd[7781]: Accepted publickey for core from 139.178.68.195 port 46140 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 10:02:30.385224 sshd[7781]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 10:02:30.392948 systemd-logind[1461]: New session 27 of user core. Feb 13 10:02:30.394734 systemd[1]: Started session-27.scope. Feb 13 10:02:30.384000 audit[7781]: CRED_ACQ pid=7781 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:30.483573 sshd[7781]: pam_unix(sshd:session): session closed for user core Feb 13 10:02:30.485046 systemd[1]: sshd@24-139.178.70.43:22-139.178.68.195:46140.service: Deactivated successfully. Feb 13 10:02:30.485509 systemd[1]: session-27.scope: Deactivated successfully. Feb 13 10:02:30.485962 systemd-logind[1461]: Session 27 logged out. Waiting for processes to exit. Feb 13 10:02:30.486355 systemd-logind[1461]: Removed session 27. Feb 13 10:02:30.565396 kernel: audit: type=1101 audit(1707818550.381:1385): pid=7781 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:30.565444 kernel: audit: type=1103 audit(1707818550.384:1386): pid=7781 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:30.565460 kernel: audit: type=1006 audit(1707818550.384:1387): pid=7781 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=27 res=1 Feb 13 10:02:30.624054 kernel: audit: type=1300 audit(1707818550.384:1387): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff7db788b0 a2=3 a3=0 items=0 ppid=1 pid=7781 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=27 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:02:30.384000 audit[7781]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff7db788b0 a2=3 a3=0 items=0 ppid=1 pid=7781 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=27 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:02:30.716145 kernel: audit: type=1327 audit(1707818550.384:1387): proctitle=737368643A20636F7265205B707269765D Feb 13 10:02:30.384000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 10:02:30.746674 kernel: audit: type=1105 audit(1707818550.403:1388): pid=7781 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:30.403000 
audit[7781]: USER_START pid=7781 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:30.841215 kernel: audit: type=1103 audit(1707818550.404:1389): pid=7783 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:30.404000 audit[7783]: CRED_ACQ pid=7783 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:30.930504 kernel: audit: type=1106 audit(1707818550.483:1390): pid=7781 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:30.483000 audit[7781]: USER_END pid=7781 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:30.484000 audit[7781]: CRED_DISP pid=7781 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:31.115626 kernel: audit: type=1104 audit(1707818550.484:1391): pid=7781 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:30.484000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@24-139.178.70.43:22-139.178.68.195:46140 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 10:02:31.855186 env[1473]: time="2024-02-13T10:02:31.855023167Z" level=info msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\"" Feb 13 10:02:31.881415 env[1473]: time="2024-02-13T10:02:31.881327768Z" level=error msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\" failed" error="failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:02:31.881573 kubelet[2593]: E0213 10:02:31.881539 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654" Feb 13 10:02:31.881573 kubelet[2593]: E0213 10:02:31.881565 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654} Feb 13 10:02:31.881763 kubelet[2593]: E0213 10:02:31.881586 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:02:31.881763 kubelet[2593]: E0213 10:02:31.881604 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-zxn6w" podUID=ac15c9fc-cc5d-4a8f-ac09-16f6497ee733 Feb 13 10:02:32.855224 env[1473]: time="2024-02-13T10:02:32.855079556Z" level=info msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\"" Feb 13 10:02:32.881518 env[1473]: time="2024-02-13T10:02:32.881459075Z" level=error msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\" failed" error="failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:02:32.881686 kubelet[2593]: E0213 10:02:32.881675 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox 
\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f" Feb 13 10:02:32.881855 kubelet[2593]: E0213 10:02:32.881702 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f} Feb 13 10:02:32.881855 kubelet[2593]: E0213 10:02:32.881726 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:02:32.881855 kubelet[2593]: E0213 10:02:32.881757 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250 Feb 13 10:02:34.854327 env[1473]: time="2024-02-13T10:02:34.854219741Z" level=info msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\"" Feb 13 10:02:34.903187 env[1473]: time="2024-02-13T10:02:34.903082363Z" level=error msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\" failed" error="failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:02:34.903422 kubelet[2593]: E0213 10:02:34.903376 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2" Feb 13 10:02:34.903422 kubelet[2593]: E0213 10:02:34.903418 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2} Feb 13 10:02:34.903790 kubelet[2593]: E0213 10:02:34.903463 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin 
type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:02:34.903790 kubelet[2593]: E0213 10:02:34.903497 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-sv24x" podUID=fe6819ac-25fb-455a-b6b5-7432acf1219d Feb 13 10:02:35.494593 systemd[1]: Started sshd@25-139.178.70.43:22-139.178.68.195:46154.service. Feb 13 10:02:35.494000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@25-139.178.70.43:22-139.178.68.195:46154 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:02:35.522131 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 10:02:35.522238 kernel: audit: type=1130 audit(1707818555.494:1393): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@25-139.178.70.43:22-139.178.68.195:46154 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:02:35.631000 audit[7894]: USER_ACCT pid=7894 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:35.631995 sshd[7894]: Accepted publickey for core from 139.178.68.195 port 46154 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 10:02:35.632636 sshd[7894]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 10:02:35.634953 systemd-logind[1461]: New session 28 of user core. Feb 13 10:02:35.635516 systemd[1]: Started session-28.scope. Feb 13 10:02:35.714205 sshd[7894]: pam_unix(sshd:session): session closed for user core Feb 13 10:02:35.715756 systemd[1]: sshd@25-139.178.70.43:22-139.178.68.195:46154.service: Deactivated successfully. Feb 13 10:02:35.716236 systemd[1]: session-28.scope: Deactivated successfully. Feb 13 10:02:35.716608 systemd-logind[1461]: Session 28 logged out. Waiting for processes to exit. Feb 13 10:02:35.717122 systemd-logind[1461]: Removed session 28. 
Feb 13 10:02:35.632000 audit[7894]: CRED_ACQ pid=7894 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:35.813904 kernel: audit: type=1101 audit(1707818555.631:1394): pid=7894 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:35.813946 kernel: audit: type=1103 audit(1707818555.632:1395): pid=7894 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:35.813965 kernel: audit: type=1006 audit(1707818555.632:1396): pid=7894 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=28 res=1 Feb 13 10:02:35.632000 audit[7894]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd70797df0 a2=3 a3=0 items=0 ppid=1 pid=7894 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=28 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:02:35.964570 kernel: audit: type=1300 audit(1707818555.632:1396): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd70797df0 a2=3 a3=0 items=0 ppid=1 pid=7894 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=28 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:02:35.964605 kernel: audit: type=1327 audit(1707818555.632:1396): proctitle=737368643A20636F7265205B707269765D Feb 13 10:02:35.632000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 10:02:35.995109 kernel: audit: type=1105 audit(1707818555.637:1397): pid=7894 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:35.637000 audit[7894]: USER_START pid=7894 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:36.089698 kernel: audit: type=1103 audit(1707818555.637:1398): pid=7896 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:35.637000 audit[7896]: CRED_ACQ pid=7896 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:36.179072 kernel: audit: type=1106 audit(1707818555.714:1399): pid=7894 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail 
acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:35.714000 audit[7894]: USER_END pid=7894 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:36.274655 kernel: audit: type=1104 audit(1707818555.714:1400): pid=7894 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:35.714000 audit[7894]: CRED_DISP pid=7894 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:35.715000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@25-139.178.70.43:22-139.178.68.195:46154 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:02:38.855326 env[1473]: time="2024-02-13T10:02:38.855209610Z" level=info msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\"" Feb 13 10:02:38.906235 env[1473]: time="2024-02-13T10:02:38.906138379Z" level=error msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\" failed" error="failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:02:38.906490 kubelet[2593]: E0213 10:02:38.906431 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768" Feb 13 10:02:38.906490 kubelet[2593]: E0213 10:02:38.906485 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768} Feb 13 10:02:38.906972 kubelet[2593]: E0213 10:02:38.906544 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:02:38.906972 kubelet[2593]: E0213 10:02:38.906585 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for 
sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-86cd8c4979-2tlsw" podUID=18384425-4aba-475c-a64f-6bfe3101b275 Feb 13 10:02:40.724416 systemd[1]: Started sshd@26-139.178.70.43:22-139.178.68.195:34056.service. Feb 13 10:02:40.724000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@26-139.178.70.43:22-139.178.68.195:34056 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:02:40.751549 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 10:02:40.751663 kernel: audit: type=1130 audit(1707818560.724:1402): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@26-139.178.70.43:22-139.178.68.195:34056 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:02:40.880000 audit[7949]: USER_ACCT pid=7949 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:40.881382 sshd[7949]: Accepted publickey for core from 139.178.68.195 port 34056 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 10:02:40.884875 sshd[7949]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 10:02:40.889330 systemd-logind[1461]: New session 29 of user core. Feb 13 10:02:40.890712 systemd[1]: Started session-29.scope. Feb 13 10:02:40.884000 audit[7949]: CRED_ACQ pid=7949 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:40.974646 sshd[7949]: pam_unix(sshd:session): session closed for user core Feb 13 10:02:40.975833 systemd[1]: sshd@26-139.178.70.43:22-139.178.68.195:34056.service: Deactivated successfully. Feb 13 10:02:40.976261 systemd[1]: session-29.scope: Deactivated successfully. Feb 13 10:02:40.976548 systemd-logind[1461]: Session 29 logged out. Waiting for processes to exit. Feb 13 10:02:40.977001 systemd-logind[1461]: Removed session 29. 
Feb 13 10:02:41.063088 kernel: audit: type=1101 audit(1707818560.880:1403): pid=7949 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:41.063133 kernel: audit: type=1103 audit(1707818560.884:1404): pid=7949 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:41.063150 kernel: audit: type=1006 audit(1707818560.884:1405): pid=7949 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=29 res=1 Feb 13 10:02:41.121692 kernel: audit: type=1300 audit(1707818560.884:1405): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe781dcaf0 a2=3 a3=0 items=0 ppid=1 pid=7949 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=29 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:02:40.884000 audit[7949]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe781dcaf0 a2=3 a3=0 items=0 ppid=1 pid=7949 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=29 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:02:41.213727 kernel: audit: type=1327 audit(1707818560.884:1405): proctitle=737368643A20636F7265205B707269765D Feb 13 10:02:40.884000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 10:02:41.244354 kernel: audit: type=1105 audit(1707818560.895:1406): pid=7949 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:40.895000 audit[7949]: USER_START pid=7949 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:41.338861 kernel: audit: type=1103 audit(1707818560.897:1407): pid=7951 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:40.897000 audit[7951]: CRED_ACQ pid=7951 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:41.428134 kernel: audit: type=1106 audit(1707818560.974:1408): pid=7949 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:40.974000 audit[7949]: USER_END pid=7949 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close 
grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:41.523702 kernel: audit: type=1104 audit(1707818560.974:1409): pid=7949 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:40.974000 audit[7949]: CRED_DISP pid=7949 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:40.975000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@26-139.178.70.43:22-139.178.68.195:34056 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:02:44.855043 env[1473]: time="2024-02-13T10:02:44.854955750Z" level=info msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\"" Feb 13 10:02:44.881763 env[1473]: time="2024-02-13T10:02:44.881665502Z" level=error msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\" failed" error="failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:02:44.881965 kubelet[2593]: E0213 10:02:44.881953 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f" Feb 13 10:02:44.882132 kubelet[2593]: E0213 10:02:44.881981 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f} Feb 13 10:02:44.882132 kubelet[2593]: E0213 10:02:44.882003 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:02:44.882132 kubelet[2593]: E0213 10:02:44.882020 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has 
mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250 Feb 13 10:02:45.854821 env[1473]: time="2024-02-13T10:02:45.854699130Z" level=info msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\"" Feb 13 10:02:45.880957 env[1473]: time="2024-02-13T10:02:45.880879348Z" level=error msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\" failed" error="failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:02:45.881256 kubelet[2593]: E0213 10:02:45.881105 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654" Feb 13 10:02:45.881256 kubelet[2593]: E0213 10:02:45.881146 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654} Feb 13 10:02:45.881256 kubelet[2593]: E0213 10:02:45.881169 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:02:45.881256 kubelet[2593]: E0213 10:02:45.881188 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-zxn6w" podUID=ac15c9fc-cc5d-4a8f-ac09-16f6497ee733 Feb 13 10:02:45.983436 systemd[1]: Started sshd@27-139.178.70.43:22-139.178.68.195:34064.service. Feb 13 10:02:45.983000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@27-139.178.70.43:22-139.178.68.195:34064 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:02:46.009956 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 10:02:46.010000 kernel: audit: type=1130 audit(1707818565.983:1411): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@27-139.178.70.43:22-139.178.68.195:34064 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 10:02:46.120000 audit[8033]: USER_ACCT pid=8033 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:46.120587 sshd[8033]: Accepted publickey for core from 139.178.68.195 port 34064 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 10:02:46.121616 sshd[8033]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 10:02:46.123961 systemd-logind[1461]: New session 30 of user core. Feb 13 10:02:46.124639 systemd[1]: Started session-30.scope. Feb 13 10:02:46.203943 sshd[8033]: pam_unix(sshd:session): session closed for user core Feb 13 10:02:46.205303 systemd[1]: sshd@27-139.178.70.43:22-139.178.68.195:34064.service: Deactivated successfully. Feb 13 10:02:46.205719 systemd[1]: session-30.scope: Deactivated successfully. Feb 13 10:02:46.206102 systemd-logind[1461]: Session 30 logged out. Waiting for processes to exit. Feb 13 10:02:46.206676 systemd-logind[1461]: Removed session 30. Feb 13 10:02:46.121000 audit[8033]: CRED_ACQ pid=8033 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:46.303461 kernel: audit: type=1101 audit(1707818566.120:1412): pid=8033 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:46.303496 kernel: audit: type=1103 audit(1707818566.121:1413): pid=8033 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:46.303516 kernel: audit: type=1006 audit(1707818566.121:1414): pid=8033 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=30 res=1 Feb 13 10:02:46.362783 kernel: audit: type=1300 audit(1707818566.121:1414): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc6a490660 a2=3 a3=0 items=0 ppid=1 pid=8033 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=30 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:02:46.121000 audit[8033]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc6a490660 a2=3 a3=0 items=0 ppid=1 pid=8033 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=30 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:02:46.454930 kernel: audit: type=1327 audit(1707818566.121:1414): proctitle=737368643A20636F7265205B707269765D Feb 13 10:02:46.121000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 10:02:46.485479 kernel: audit: type=1105 audit(1707818566.126:1415): pid=8033 uid=0 auid=500 ses=30 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:46.126000 
audit[8033]: USER_START pid=8033 uid=0 auid=500 ses=30 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:46.579991 kernel: audit: type=1103 audit(1707818566.127:1416): pid=8035 uid=0 auid=500 ses=30 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:46.127000 audit[8035]: CRED_ACQ pid=8035 uid=0 auid=500 ses=30 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:46.204000 audit[8033]: USER_END pid=8033 uid=0 auid=500 ses=30 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:46.764883 kernel: audit: type=1106 audit(1707818566.204:1417): pid=8033 uid=0 auid=500 ses=30 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:46.764915 kernel: audit: type=1104 audit(1707818566.204:1418): pid=8033 uid=0 auid=500 ses=30 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:46.204000 audit[8033]: CRED_DISP pid=8033 uid=0 auid=500 ses=30 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:46.205000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@27-139.178.70.43:22-139.178.68.195:34064 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 10:02:48.855600 env[1473]: time="2024-02-13T10:02:48.855466151Z" level=info msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\"" Feb 13 10:02:48.898672 env[1473]: time="2024-02-13T10:02:48.898609518Z" level=error msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\" failed" error="failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:02:48.898842 kubelet[2593]: E0213 10:02:48.898787 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2" Feb 13 10:02:48.898842 kubelet[2593]: E0213 10:02:48.898813 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2} Feb 13 10:02:48.898842 kubelet[2593]: E0213 10:02:48.898833 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:02:48.899063 kubelet[2593]: E0213 10:02:48.898851 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-sv24x" podUID=fe6819ac-25fb-455a-b6b5-7432acf1219d Feb 13 10:02:51.210918 systemd[1]: Started sshd@28-139.178.70.43:22-139.178.68.195:55196.service. Feb 13 10:02:51.210000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@28-139.178.70.43:22-139.178.68.195:55196 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:02:51.237879 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 10:02:51.237988 kernel: audit: type=1130 audit(1707818571.210:1420): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@28-139.178.70.43:22-139.178.68.195:55196 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 10:02:51.346000 audit[8084]: USER_ACCT pid=8084 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:51.346777 sshd[8084]: Accepted publickey for core from 139.178.68.195 port 55196 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 10:02:51.348313 sshd[8084]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 10:02:51.350657 systemd-logind[1461]: New session 31 of user core. Feb 13 10:02:51.351122 systemd[1]: Started session-31.scope. Feb 13 10:02:51.431758 sshd[8084]: pam_unix(sshd:session): session closed for user core Feb 13 10:02:51.433120 systemd[1]: sshd@28-139.178.70.43:22-139.178.68.195:55196.service: Deactivated successfully. Feb 13 10:02:51.433550 systemd[1]: session-31.scope: Deactivated successfully. Feb 13 10:02:51.433959 systemd-logind[1461]: Session 31 logged out. Waiting for processes to exit. Feb 13 10:02:51.434476 systemd-logind[1461]: Removed session 31. Feb 13 10:02:51.347000 audit[8084]: CRED_ACQ pid=8084 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:51.528606 kernel: audit: type=1101 audit(1707818571.346:1421): pid=8084 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:51.528696 kernel: audit: type=1103 audit(1707818571.347:1422): pid=8084 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:51.528713 kernel: audit: type=1006 audit(1707818571.347:1423): pid=8084 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=31 res=1 Feb 13 10:02:51.587234 kernel: audit: type=1300 audit(1707818571.347:1423): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff5d461830 a2=3 a3=0 items=0 ppid=1 pid=8084 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=31 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:02:51.347000 audit[8084]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff5d461830 a2=3 a3=0 items=0 ppid=1 pid=8084 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=31 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:02:51.347000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 10:02:51.709888 kernel: audit: type=1327 audit(1707818571.347:1423): proctitle=737368643A20636F7265205B707269765D Feb 13 10:02:51.709916 kernel: audit: type=1105 audit(1707818571.352:1424): pid=8084 uid=0 auid=500 ses=31 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:51.352000 
audit[8084]: USER_START pid=8084 uid=0 auid=500 ses=31 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:51.353000 audit[8086]: CRED_ACQ pid=8086 uid=0 auid=500 ses=31 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:51.854003 env[1473]: time="2024-02-13T10:02:51.853916831Z" level=info msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\"" Feb 13 10:02:51.865635 env[1473]: time="2024-02-13T10:02:51.865573648Z" level=error msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\" failed" error="failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:02:51.865741 kubelet[2593]: E0213 10:02:51.865717 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768" Feb 13 10:02:51.865901 kubelet[2593]: E0213 10:02:51.865745 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768} Feb 13 10:02:51.865901 kubelet[2593]: E0213 10:02:51.865766 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:02:51.865901 kubelet[2593]: E0213 10:02:51.865783 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-86cd8c4979-2tlsw" podUID=18384425-4aba-475c-a64f-6bfe3101b275 Feb 13 10:02:51.893730 kernel: audit: type=1103 audit(1707818571.353:1425): pid=8086 uid=0 auid=500 ses=31 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:51.893774 
kernel: audit: type=1106 audit(1707818571.432:1426): pid=8084 uid=0 auid=500 ses=31 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:51.432000 audit[8084]: USER_END pid=8084 uid=0 auid=500 ses=31 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:51.989429 kernel: audit: type=1104 audit(1707818571.432:1427): pid=8084 uid=0 auid=500 ses=31 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:51.432000 audit[8084]: CRED_DISP pid=8084 uid=0 auid=500 ses=31 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:51.432000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@28-139.178.70.43:22-139.178.68.195:55196 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:02:55.208000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:02:55.208000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:02:55.208000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c0030078f0 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 10:02:55.208000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 10:02:55.208000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=c a1=c0022aa460 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 10:02:55.208000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 10:02:55.438000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" 
path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:02:55.438000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=67 a1=c010017b40 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 10:02:55.438000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:02:55.438000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 10:02:55.438000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=68 a1=c0150b8fc0 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 10:02:55.438000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 10:02:55.439000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:02:55.439000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=67 a1=c0150b9020 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 10:02:55.439000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 10:02:55.441000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-client.crt" dev="sdb9" ino=525079 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:02:55.441000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=67 a1=c0150b9080 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 10:02:55.441000 audit: PROCTITLE 
proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 10:02:55.441000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:02:55.441000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=67 a1=c013e60060 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 10:02:55.441000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 10:02:55.441000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/apiserver.crt" dev="sdb9" ino=525073 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:02:55.441000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=69 a1=c0150b90b0 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 10:02:55.441000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 10:02:56.441340 systemd[1]: Started sshd@29-139.178.70.43:22-139.178.68.195:42554.service. Feb 13 10:02:56.441000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@29-139.178.70.43:22-139.178.68.195:42554 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:02:56.468346 kernel: kauditd_printk_skb: 25 callbacks suppressed Feb 13 10:02:56.468416 kernel: audit: type=1130 audit(1707818576.441:1437): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@29-139.178.70.43:22-139.178.68.195:42554 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:02:56.576000 audit[8137]: USER_ACCT pid=8137 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:56.576788 sshd[8137]: Accepted publickey for core from 139.178.68.195 port 42554 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 10:02:56.577893 sshd[8137]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 10:02:56.580187 systemd-logind[1461]: New session 32 of user core. 
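In the SYSCALL records above, arch=c000003e is AUDIT_ARCH_X86_64, syscall=254 is inotify_add_watch on that architecture, and success=no exit=-13 is -EACCES: SELinux is refusing kube-apiserver and kube-controller-manager inotify watches on the certificate files under /etc/kubernetes/pki, matching the preceding "denied { watch }" AVC lines. A toy Go interpreter for those three fields (lookup tables deliberately limited to the values seen in these records):

    package main

    import "fmt"

    // Only the values that appear in the records above; a real decoder
    // would carry the full x86-64 syscall and errno tables.
    var syscalls = map[int]string{254: "inotify_add_watch"}
    var errnos = map[int]string{13: "EACCES (permission denied)"}

    func explain(arch string, nr, exit int) string {
        if arch != "c000003e" {
            return "unknown architecture"
        }
        if exit < 0 {
            return fmt.Sprintf("%s failed with %s", syscalls[nr], errnos[-exit])
        }
        return fmt.Sprintf("%s succeeded", syscalls[nr])
    }

    func main() {
        // Fields from the type=1300 records above.
        fmt.Println(explain("c000003e", 254, -13))
        // inotify_add_watch failed with EACCES (permission denied)
    }

With permissive=0 the denial is enforced, which is why the same AVC pair repeats each time the control-plane components retry the watch.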
Feb 13 10:02:56.580745 systemd[1]: Started session-32.scope. Feb 13 10:02:56.661801 sshd[8137]: pam_unix(sshd:session): session closed for user core Feb 13 10:02:56.663218 systemd[1]: sshd@29-139.178.70.43:22-139.178.68.195:42554.service: Deactivated successfully. Feb 13 10:02:56.663679 systemd[1]: session-32.scope: Deactivated successfully. Feb 13 10:02:56.664068 systemd-logind[1461]: Session 32 logged out. Waiting for processes to exit. Feb 13 10:02:56.664589 systemd-logind[1461]: Removed session 32. Feb 13 10:02:56.577000 audit[8137]: CRED_ACQ pid=8137 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:56.760768 kernel: audit: type=1101 audit(1707818576.576:1438): pid=8137 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:56.760814 kernel: audit: type=1103 audit(1707818576.577:1439): pid=8137 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:56.760833 kernel: audit: type=1006 audit(1707818576.577:1440): pid=8137 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=32 res=1 Feb 13 10:02:56.819411 kernel: audit: type=1300 audit(1707818576.577:1440): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffef3f45ba0 a2=3 a3=0 items=0 ppid=1 pid=8137 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=32 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:02:56.577000 audit[8137]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffef3f45ba0 a2=3 a3=0 items=0 ppid=1 pid=8137 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=32 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:02:56.854016 env[1473]: time="2024-02-13T10:02:56.853969638Z" level=info msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\"" Feb 13 10:02:56.865705 env[1473]: time="2024-02-13T10:02:56.865668400Z" level=error msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\" failed" error="failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:02:56.865887 kubelet[2593]: E0213 10:02:56.865844 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f" Feb 13 10:02:56.865887 kubelet[2593]: E0213 10:02:56.865871 2593 
kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f} Feb 13 10:02:56.866242 kubelet[2593]: E0213 10:02:56.865892 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:02:56.866242 kubelet[2593]: E0213 10:02:56.865909 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250 Feb 13 10:02:56.577000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 10:02:56.942022 kernel: audit: type=1327 audit(1707818576.577:1440): proctitle=737368643A20636F7265205B707269765D Feb 13 10:02:56.942062 kernel: audit: type=1105 audit(1707818576.582:1441): pid=8137 uid=0 auid=500 ses=32 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:56.582000 audit[8137]: USER_START pid=8137 uid=0 auid=500 ses=32 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:57.036603 kernel: audit: type=1103 audit(1707818576.582:1442): pid=8139 uid=0 auid=500 ses=32 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:56.582000 audit[8139]: CRED_ACQ pid=8139 uid=0 auid=500 ses=32 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:57.125866 kernel: audit: type=1106 audit(1707818576.662:1443): pid=8137 uid=0 auid=500 ses=32 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:56.662000 audit[8137]: USER_END pid=8137 uid=0 auid=500 ses=32 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh 
res=success' Feb 13 10:02:56.662000 audit[8137]: CRED_DISP pid=8137 uid=0 auid=500 ses=32 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:57.310808 kernel: audit: type=1104 audit(1707818576.662:1444): pid=8137 uid=0 auid=500 ses=32 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:02:56.663000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@29-139.178.70.43:22-139.178.68.195:42554 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:03:00.855148 env[1473]: time="2024-02-13T10:03:00.855061458Z" level=info msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\"" Feb 13 10:03:00.856104 env[1473]: time="2024-02-13T10:03:00.855162640Z" level=info msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\"" Feb 13 10:03:00.905328 env[1473]: time="2024-02-13T10:03:00.905250663Z" level=error msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\" failed" error="failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:03:00.905518 env[1473]: time="2024-02-13T10:03:00.905430438Z" level=error msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\" failed" error="failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:03:00.905622 kubelet[2593]: E0213 10:03:00.905596 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654" Feb 13 10:03:00.906015 kubelet[2593]: E0213 10:03:00.905615 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2" Feb 13 10:03:00.906015 kubelet[2593]: E0213 10:03:00.905655 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654} Feb 13 10:03:00.906015 kubelet[2593]: E0213 10:03:00.905658 2593 
kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2} Feb 13 10:03:00.906015 kubelet[2593]: E0213 10:03:00.905712 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:03:00.906015 kubelet[2593]: E0213 10:03:00.905717 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:03:00.906311 kubelet[2593]: E0213 10:03:00.905755 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-zxn6w" podUID=ac15c9fc-cc5d-4a8f-ac09-16f6497ee733 Feb 13 10:03:00.906311 kubelet[2593]: E0213 10:03:00.905767 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-sv24x" podUID=fe6819ac-25fb-455a-b6b5-7432acf1219d Feb 13 10:03:01.671578 systemd[1]: Started sshd@30-139.178.70.43:22-139.178.68.195:42570.service. Feb 13 10:03:01.671000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@30-139.178.70.43:22-139.178.68.195:42570 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:03:01.698642 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 10:03:01.698709 kernel: audit: type=1130 audit(1707818581.671:1446): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@30-139.178.70.43:22-139.178.68.195:42570 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 10:03:01.813000 audit[8254]: USER_ACCT pid=8254 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:01.814448 sshd[8254]: Accepted publickey for core from 139.178.68.195 port 42570 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 10:03:01.815640 sshd[8254]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 10:03:01.817854 systemd-logind[1461]: New session 33 of user core. Feb 13 10:03:01.818323 systemd[1]: Started session-33.scope. Feb 13 10:03:01.815000 audit[8254]: CRED_ACQ pid=8254 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:01.996264 kernel: audit: type=1101 audit(1707818581.813:1447): pid=8254 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:01.996300 kernel: audit: type=1103 audit(1707818581.815:1448): pid=8254 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:01.996318 kernel: audit: type=1006 audit(1707818581.815:1449): pid=8254 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=33 res=1 Feb 13 10:03:02.054907 kernel: audit: type=1300 audit(1707818581.815:1449): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff1bbe1000 a2=3 a3=0 items=0 ppid=1 pid=8254 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=33 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:03:01.815000 audit[8254]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff1bbe1000 a2=3 a3=0 items=0 ppid=1 pid=8254 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=33 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:03:02.146966 kernel: audit: type=1327 audit(1707818581.815:1449): proctitle=737368643A20636F7265205B707269765D Feb 13 10:03:01.815000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 10:03:02.147187 sshd[8254]: pam_unix(sshd:session): session closed for user core Feb 13 10:03:02.148746 systemd[1]: sshd@30-139.178.70.43:22-139.178.68.195:42570.service: Deactivated successfully. Feb 13 10:03:02.149163 systemd[1]: session-33.scope: Deactivated successfully. Feb 13 10:03:02.149551 systemd-logind[1461]: Session 33 logged out. Waiting for processes to exit. Feb 13 10:03:02.150033 systemd-logind[1461]: Removed session 33. 
Feb 13 10:03:02.177478 kernel: audit: type=1105 audit(1707818581.820:1450): pid=8254 uid=0 auid=500 ses=33 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:01.820000 audit[8254]: USER_START pid=8254 uid=0 auid=500 ses=33 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:01.820000 audit[8256]: CRED_ACQ pid=8256 uid=0 auid=500 ses=33 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:02.361485 kernel: audit: type=1103 audit(1707818581.820:1451): pid=8256 uid=0 auid=500 ses=33 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:02.361525 kernel: audit: type=1106 audit(1707818582.147:1452): pid=8254 uid=0 auid=500 ses=33 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:02.147000 audit[8254]: USER_END pid=8254 uid=0 auid=500 ses=33 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:02.457066 kernel: audit: type=1104 audit(1707818582.147:1453): pid=8254 uid=0 auid=500 ses=33 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:02.147000 audit[8254]: CRED_DISP pid=8254 uid=0 auid=500 ses=33 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:02.148000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@30-139.178.70.43:22-139.178.68.195:42570 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 10:03:05.854597 env[1473]: time="2024-02-13T10:03:05.854486059Z" level=info msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\"" Feb 13 10:03:05.884036 env[1473]: time="2024-02-13T10:03:05.883959949Z" level=error msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\" failed" error="failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:03:05.884156 kubelet[2593]: E0213 10:03:05.884119 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768" Feb 13 10:03:05.884156 kubelet[2593]: E0213 10:03:05.884147 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768} Feb 13 10:03:05.884333 kubelet[2593]: E0213 10:03:05.884169 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:03:05.884333 kubelet[2593]: E0213 10:03:05.884189 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-86cd8c4979-2tlsw" podUID=18384425-4aba-475c-a64f-6bfe3101b275 Feb 13 10:03:07.098662 systemd[1]: Started sshd@31-139.178.70.43:22-139.178.68.195:49416.service. Feb 13 10:03:07.098000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@31-139.178.70.43:22-139.178.68.195:49416 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:03:07.125699 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 10:03:07.125795 kernel: audit: type=1130 audit(1707818587.098:1455): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@31-139.178.70.43:22-139.178.68.195:49416 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 10:03:07.234000 audit[8308]: USER_ACCT pid=8308 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:07.235411 sshd[8308]: Accepted publickey for core from 139.178.68.195 port 49416 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 10:03:07.236640 sshd[8308]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 10:03:07.238956 systemd-logind[1461]: New session 34 of user core. Feb 13 10:03:07.239433 systemd[1]: Started session-34.scope. Feb 13 10:03:07.316982 sshd[8308]: pam_unix(sshd:session): session closed for user core Feb 13 10:03:07.318451 systemd[1]: sshd@31-139.178.70.43:22-139.178.68.195:49416.service: Deactivated successfully. Feb 13 10:03:07.318881 systemd[1]: session-34.scope: Deactivated successfully. Feb 13 10:03:07.319194 systemd-logind[1461]: Session 34 logged out. Waiting for processes to exit. Feb 13 10:03:07.319755 systemd-logind[1461]: Removed session 34. Feb 13 10:03:07.236000 audit[8308]: CRED_ACQ pid=8308 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:07.417402 kernel: audit: type=1101 audit(1707818587.234:1456): pid=8308 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:07.417439 kernel: audit: type=1103 audit(1707818587.236:1457): pid=8308 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:07.417458 kernel: audit: type=1006 audit(1707818587.236:1458): pid=8308 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=34 res=1 Feb 13 10:03:07.475986 kernel: audit: type=1300 audit(1707818587.236:1458): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffcd90a4e90 a2=3 a3=0 items=0 ppid=1 pid=8308 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=34 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:03:07.236000 audit[8308]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffcd90a4e90 a2=3 a3=0 items=0 ppid=1 pid=8308 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=34 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:03:07.568051 kernel: audit: type=1327 audit(1707818587.236:1458): proctitle=737368643A20636F7265205B707269765D Feb 13 10:03:07.236000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 10:03:07.598562 kernel: audit: type=1105 audit(1707818587.241:1459): pid=8308 uid=0 auid=500 ses=34 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:07.241000 
audit[8308]: USER_START pid=8308 uid=0 auid=500 ses=34 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:07.693072 kernel: audit: type=1103 audit(1707818587.241:1460): pid=8310 uid=0 auid=500 ses=34 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:07.241000 audit[8310]: CRED_ACQ pid=8310 uid=0 auid=500 ses=34 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:07.782411 kernel: audit: type=1106 audit(1707818587.317:1461): pid=8308 uid=0 auid=500 ses=34 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:07.317000 audit[8308]: USER_END pid=8308 uid=0 auid=500 ses=34 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:07.317000 audit[8308]: CRED_DISP pid=8308 uid=0 auid=500 ses=34 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:07.967332 kernel: audit: type=1104 audit(1707818587.317:1462): pid=8308 uid=0 auid=500 ses=34 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:07.318000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@31-139.178.70.43:22-139.178.68.195:49416 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 10:03:09.614000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:03:09.614000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c002c10e00 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 10:03:09.614000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 10:03:09.615000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:03:09.615000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c002c10e20 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 10:03:09.615000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 10:03:09.618000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:03:09.618000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:03:09.618000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c0022ab160 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 10:03:09.618000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 10:03:09.618000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=c a1=c0024df4a0 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 10:03:09.618000 audit: 
PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 10:03:09.855001 env[1473]: time="2024-02-13T10:03:09.854918886Z" level=info msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\"" Feb 13 10:03:09.880395 env[1473]: time="2024-02-13T10:03:09.880312254Z" level=error msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\" failed" error="failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:03:09.880559 kubelet[2593]: E0213 10:03:09.880541 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f" Feb 13 10:03:09.880736 kubelet[2593]: E0213 10:03:09.880578 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f} Feb 13 10:03:09.880736 kubelet[2593]: E0213 10:03:09.880604 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:03:09.880736 kubelet[2593]: E0213 10:03:09.880624 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250 Feb 13 10:03:12.326107 systemd[1]: Started sshd@32-139.178.70.43:22-139.178.68.195:49428.service. Feb 13 10:03:12.324000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@32-139.178.70.43:22-139.178.68.195:49428 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 10:03:12.353332 kernel: kauditd_printk_skb: 13 callbacks suppressed Feb 13 10:03:12.353440 kernel: audit: type=1130 audit(1707818592.324:1468): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@32-139.178.70.43:22-139.178.68.195:49428 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:03:12.461000 audit[8360]: USER_ACCT pid=8360 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:12.462663 sshd[8360]: Accepted publickey for core from 139.178.68.195 port 49428 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 10:03:12.463518 sshd[8360]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 10:03:12.465715 systemd-logind[1461]: New session 35 of user core. Feb 13 10:03:12.466163 systemd[1]: Started session-35.scope. Feb 13 10:03:12.544444 sshd[8360]: pam_unix(sshd:session): session closed for user core Feb 13 10:03:12.545958 systemd[1]: sshd@32-139.178.70.43:22-139.178.68.195:49428.service: Deactivated successfully. Feb 13 10:03:12.546388 systemd[1]: session-35.scope: Deactivated successfully. Feb 13 10:03:12.546779 systemd-logind[1461]: Session 35 logged out. Waiting for processes to exit. Feb 13 10:03:12.547257 systemd-logind[1461]: Removed session 35. Feb 13 10:03:12.461000 audit[8360]: CRED_ACQ pid=8360 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:12.644509 kernel: audit: type=1101 audit(1707818592.461:1469): pid=8360 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:12.644546 kernel: audit: type=1103 audit(1707818592.461:1470): pid=8360 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:12.644565 kernel: audit: type=1006 audit(1707818592.461:1471): pid=8360 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=35 res=1 Feb 13 10:03:12.703147 kernel: audit: type=1300 audit(1707818592.461:1471): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc5c6a51c0 a2=3 a3=0 items=0 ppid=1 pid=8360 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=35 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:03:12.461000 audit[8360]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc5c6a51c0 a2=3 a3=0 items=0 ppid=1 pid=8360 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=35 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:03:12.795192 kernel: audit: type=1327 audit(1707818592.461:1471): proctitle=737368643A20636F7265205B707269765D Feb 13 10:03:12.461000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 10:03:12.825712 kernel: 
audit: type=1105 audit(1707818592.466:1472): pid=8360 uid=0 auid=500 ses=35 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:12.466000 audit[8360]: USER_START pid=8360 uid=0 auid=500 ses=35 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:12.853608 env[1473]: time="2024-02-13T10:03:12.853545797Z" level=info msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\"" Feb 13 10:03:12.865156 env[1473]: time="2024-02-13T10:03:12.865088950Z" level=error msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\" failed" error="failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:03:12.865281 kubelet[2593]: E0213 10:03:12.865270 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2" Feb 13 10:03:12.865453 kubelet[2593]: E0213 10:03:12.865298 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2} Feb 13 10:03:12.865453 kubelet[2593]: E0213 10:03:12.865322 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:03:12.865453 kubelet[2593]: E0213 10:03:12.865348 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-sv24x" podUID=fe6819ac-25fb-455a-b6b5-7432acf1219d Feb 13 10:03:12.920226 kernel: audit: type=1103 audit(1707818592.467:1473): pid=8362 uid=0 auid=500 ses=35 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" 
hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:12.467000 audit[8362]: CRED_ACQ pid=8362 uid=0 auid=500 ses=35 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:13.009537 kernel: audit: type=1106 audit(1707818592.543:1474): pid=8360 uid=0 auid=500 ses=35 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:12.543000 audit[8360]: USER_END pid=8360 uid=0 auid=500 ses=35 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:13.105118 kernel: audit: type=1104 audit(1707818592.543:1475): pid=8360 uid=0 auid=500 ses=35 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:12.543000 audit[8360]: CRED_DISP pid=8360 uid=0 auid=500 ses=35 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:12.544000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@32-139.178.70.43:22-139.178.68.195:49428 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 10:03:14.855231 env[1473]: time="2024-02-13T10:03:14.855087521Z" level=info msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\"" Feb 13 10:03:14.882281 env[1473]: time="2024-02-13T10:03:14.882247718Z" level=error msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\" failed" error="failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:03:14.882441 kubelet[2593]: E0213 10:03:14.882430 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654" Feb 13 10:03:14.882603 kubelet[2593]: E0213 10:03:14.882457 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654} Feb 13 10:03:14.882603 kubelet[2593]: E0213 10:03:14.882480 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:03:14.882603 kubelet[2593]: E0213 10:03:14.882497 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-zxn6w" podUID=ac15c9fc-cc5d-4a8f-ac09-16f6497ee733 Feb 13 10:03:17.553055 systemd[1]: Started sshd@33-139.178.70.43:22-139.178.68.195:39040.service. Feb 13 10:03:17.551000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@33-139.178.70.43:22-139.178.68.195:39040 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:03:17.579390 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 10:03:17.579428 kernel: audit: type=1130 audit(1707818597.551:1477): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@33-139.178.70.43:22-139.178.68.195:39040 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 10:03:17.689000 audit[8442]: USER_ACCT pid=8442 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:17.690689 sshd[8442]: Accepted publickey for core from 139.178.68.195 port 39040 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 10:03:17.692645 sshd[8442]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 10:03:17.695004 systemd-logind[1461]: New session 36 of user core. Feb 13 10:03:17.695506 systemd[1]: Started session-36.scope. Feb 13 10:03:17.773424 sshd[8442]: pam_unix(sshd:session): session closed for user core Feb 13 10:03:17.774919 systemd[1]: sshd@33-139.178.70.43:22-139.178.68.195:39040.service: Deactivated successfully. Feb 13 10:03:17.775353 systemd[1]: session-36.scope: Deactivated successfully. Feb 13 10:03:17.775748 systemd-logind[1461]: Session 36 logged out. Waiting for processes to exit. Feb 13 10:03:17.776249 systemd-logind[1461]: Removed session 36. Feb 13 10:03:17.691000 audit[8442]: CRED_ACQ pid=8442 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:17.782413 kernel: audit: type=1101 audit(1707818597.689:1478): pid=8442 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:17.782451 kernel: audit: type=1103 audit(1707818597.691:1479): pid=8442 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:17.853914 env[1473]: time="2024-02-13T10:03:17.853865332Z" level=info msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\"" Feb 13 10:03:17.867418 env[1473]: time="2024-02-13T10:03:17.867324884Z" level=error msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\" failed" error="failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:03:17.867583 kubelet[2593]: E0213 10:03:17.867533 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768" Feb 13 10:03:17.867583 kubelet[2593]: E0213 10:03:17.867558 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768} Feb 13 10:03:17.867583 
kubelet[2593]: E0213 10:03:17.867580 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:03:17.867819 kubelet[2593]: E0213 10:03:17.867598 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-86cd8c4979-2tlsw" podUID=18384425-4aba-475c-a64f-6bfe3101b275 Feb 13 10:03:17.931057 kernel: audit: type=1006 audit(1707818597.691:1480): pid=8442 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=36 res=1 Feb 13 10:03:17.931090 kernel: audit: type=1300 audit(1707818597.691:1480): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc12164b20 a2=3 a3=0 items=0 ppid=1 pid=8442 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=36 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:03:17.691000 audit[8442]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc12164b20 a2=3 a3=0 items=0 ppid=1 pid=8442 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=36 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:03:18.023045 kernel: audit: type=1327 audit(1707818597.691:1480): proctitle=737368643A20636F7265205B707269765D Feb 13 10:03:17.691000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 10:03:18.053557 kernel: audit: type=1105 audit(1707818597.696:1481): pid=8442 uid=0 auid=500 ses=36 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:17.696000 audit[8442]: USER_START pid=8442 uid=0 auid=500 ses=36 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:18.148151 kernel: audit: type=1103 audit(1707818597.696:1482): pid=8444 uid=0 auid=500 ses=36 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:17.696000 audit[8444]: CRED_ACQ pid=8444 uid=0 auid=500 ses=36 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 
10:03:18.237356 kernel: audit: type=1106 audit(1707818597.772:1483): pid=8442 uid=0 auid=500 ses=36 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:17.772000 audit[8442]: USER_END pid=8442 uid=0 auid=500 ses=36 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:17.772000 audit[8442]: CRED_DISP pid=8442 uid=0 auid=500 ses=36 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:18.422175 kernel: audit: type=1104 audit(1707818597.772:1484): pid=8442 uid=0 auid=500 ses=36 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:17.773000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@33-139.178.70.43:22-139.178.68.195:39040 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:03:21.854635 env[1473]: time="2024-02-13T10:03:21.854468086Z" level=info msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\"" Feb 13 10:03:21.880437 env[1473]: time="2024-02-13T10:03:21.880312870Z" level=error msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\" failed" error="failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:03:21.880650 kubelet[2593]: E0213 10:03:21.880608 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f" Feb 13 10:03:21.880650 kubelet[2593]: E0213 10:03:21.880635 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f} Feb 13 10:03:21.880826 kubelet[2593]: E0213 10:03:21.880657 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:03:21.880826 
kubelet[2593]: E0213 10:03:21.880683 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250 Feb 13 10:03:22.780904 systemd[1]: Started sshd@34-139.178.70.43:22-139.178.68.195:39046.service. Feb 13 10:03:22.779000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@34-139.178.70.43:22-139.178.68.195:39046 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:03:22.816531 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 10:03:22.816632 kernel: audit: type=1130 audit(1707818602.779:1486): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@34-139.178.70.43:22-139.178.68.195:39046 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:03:22.923000 audit[8523]: USER_ACCT pid=8523 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:22.924895 sshd[8523]: Accepted publickey for core from 139.178.68.195 port 39046 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 10:03:22.926670 sshd[8523]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 10:03:22.929011 systemd-logind[1461]: New session 37 of user core. Feb 13 10:03:22.929545 systemd[1]: Started session-37.scope. Feb 13 10:03:23.006141 sshd[8523]: pam_unix(sshd:session): session closed for user core Feb 13 10:03:23.007635 systemd[1]: sshd@34-139.178.70.43:22-139.178.68.195:39046.service: Deactivated successfully. Feb 13 10:03:23.008075 systemd[1]: session-37.scope: Deactivated successfully. Feb 13 10:03:23.008358 systemd-logind[1461]: Session 37 logged out. Waiting for processes to exit. Feb 13 10:03:23.008891 systemd-logind[1461]: Removed session 37. 
Feb 13 10:03:22.925000 audit[8523]: CRED_ACQ pid=8523 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:23.106739 kernel: audit: type=1101 audit(1707818602.923:1487): pid=8523 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:23.106786 kernel: audit: type=1103 audit(1707818602.925:1488): pid=8523 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:23.106804 kernel: audit: type=1006 audit(1707818602.925:1489): pid=8523 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=37 res=1 Feb 13 10:03:22.925000 audit[8523]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffec1cb16a0 a2=3 a3=0 items=0 ppid=1 pid=8523 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=37 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:03:23.257426 kernel: audit: type=1300 audit(1707818602.925:1489): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffec1cb16a0 a2=3 a3=0 items=0 ppid=1 pid=8523 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=37 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:03:23.257511 kernel: audit: type=1327 audit(1707818602.925:1489): proctitle=737368643A20636F7265205B707269765D Feb 13 10:03:22.925000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 10:03:23.287972 kernel: audit: type=1105 audit(1707818602.930:1490): pid=8523 uid=0 auid=500 ses=37 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:22.930000 audit[8523]: USER_START pid=8523 uid=0 auid=500 ses=37 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:22.930000 audit[8525]: CRED_ACQ pid=8525 uid=0 auid=500 ses=37 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:23.471796 kernel: audit: type=1103 audit(1707818602.930:1491): pid=8525 uid=0 auid=500 ses=37 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:23.471830 kernel: audit: type=1106 audit(1707818603.005:1492): pid=8523 uid=0 auid=500 ses=37 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail 
acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:23.005000 audit[8523]: USER_END pid=8523 uid=0 auid=500 ses=37 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:23.005000 audit[8523]: CRED_DISP pid=8523 uid=0 auid=500 ses=37 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:23.656749 kernel: audit: type=1104 audit(1707818603.005:1493): pid=8523 uid=0 auid=500 ses=37 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:23.006000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@34-139.178.70.43:22-139.178.68.195:39046 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:03:24.855070 env[1473]: time="2024-02-13T10:03:24.854959141Z" level=info msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\"" Feb 13 10:03:24.910683 env[1473]: time="2024-02-13T10:03:24.910567975Z" level=error msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\" failed" error="failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:03:24.910923 kubelet[2593]: E0213 10:03:24.910885 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2" Feb 13 10:03:24.911433 kubelet[2593]: E0213 10:03:24.910948 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2} Feb 13 10:03:24.911433 kubelet[2593]: E0213 10:03:24.911016 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:03:24.911433 kubelet[2593]: E0213 10:03:24.911066 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for 
sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-sv24x" podUID=fe6819ac-25fb-455a-b6b5-7432acf1219d Feb 13 10:03:28.015219 systemd[1]: Started sshd@35-139.178.70.43:22-139.178.68.195:50126.service. Feb 13 10:03:28.013000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@35-139.178.70.43:22-139.178.68.195:50126 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:03:28.042322 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 10:03:28.042379 kernel: audit: type=1130 audit(1707818608.013:1495): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@35-139.178.70.43:22-139.178.68.195:50126 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:03:28.151000 audit[8578]: USER_ACCT pid=8578 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:28.152505 sshd[8578]: Accepted publickey for core from 139.178.68.195 port 50126 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 10:03:28.153191 sshd[8578]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 10:03:28.155702 systemd-logind[1461]: New session 38 of user core. Feb 13 10:03:28.156237 systemd[1]: Started session-38.scope. Feb 13 10:03:28.233737 sshd[8578]: pam_unix(sshd:session): session closed for user core Feb 13 10:03:28.235155 systemd[1]: sshd@35-139.178.70.43:22-139.178.68.195:50126.service: Deactivated successfully. Feb 13 10:03:28.235588 systemd[1]: session-38.scope: Deactivated successfully. Feb 13 10:03:28.236006 systemd-logind[1461]: Session 38 logged out. Waiting for processes to exit. Feb 13 10:03:28.236540 systemd-logind[1461]: Removed session 38. 
Feb 13 10:03:28.151000 audit[8578]: CRED_ACQ pid=8578 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:28.334728 kernel: audit: type=1101 audit(1707818608.151:1496): pid=8578 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:28.334765 kernel: audit: type=1103 audit(1707818608.151:1497): pid=8578 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:28.334782 kernel: audit: type=1006 audit(1707818608.151:1498): pid=8578 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=38 res=1 Feb 13 10:03:28.393428 kernel: audit: type=1300 audit(1707818608.151:1498): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd636abdf0 a2=3 a3=0 items=0 ppid=1 pid=8578 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=38 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:03:28.151000 audit[8578]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd636abdf0 a2=3 a3=0 items=0 ppid=1 pid=8578 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=38 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:03:28.485526 kernel: audit: type=1327 audit(1707818608.151:1498): proctitle=737368643A20636F7265205B707269765D Feb 13 10:03:28.151000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 10:03:28.516065 kernel: audit: type=1105 audit(1707818608.156:1499): pid=8578 uid=0 auid=500 ses=38 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:28.156000 audit[8578]: USER_START pid=8578 uid=0 auid=500 ses=38 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:28.610632 kernel: audit: type=1103 audit(1707818608.157:1500): pid=8580 uid=0 auid=500 ses=38 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:28.157000 audit[8580]: CRED_ACQ pid=8580 uid=0 auid=500 ses=38 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:28.699997 kernel: audit: type=1106 audit(1707818608.233:1501): pid=8578 uid=0 auid=500 ses=38 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail 
acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:28.233000 audit[8578]: USER_END pid=8578 uid=0 auid=500 ses=38 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:28.795630 kernel: audit: type=1104 audit(1707818608.233:1502): pid=8578 uid=0 auid=500 ses=38 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:28.233000 audit[8578]: CRED_DISP pid=8578 uid=0 auid=500 ses=38 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:28.233000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@35-139.178.70.43:22-139.178.68.195:50126 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:03:29.854376 env[1473]: time="2024-02-13T10:03:29.854236256Z" level=info msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\"" Feb 13 10:03:29.881523 env[1473]: time="2024-02-13T10:03:29.881465236Z" level=error msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\" failed" error="failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:03:29.881621 kubelet[2593]: E0213 10:03:29.881594 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654" Feb 13 10:03:29.881621 kubelet[2593]: E0213 10:03:29.881617 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654} Feb 13 10:03:29.881802 kubelet[2593]: E0213 10:03:29.881638 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:03:29.881802 kubelet[2593]: E0213 10:03:29.881655 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for 
sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-zxn6w" podUID=ac15c9fc-cc5d-4a8f-ac09-16f6497ee733 Feb 13 10:03:31.855089 env[1473]: time="2024-02-13T10:03:31.854970768Z" level=info msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\"" Feb 13 10:03:31.881204 env[1473]: time="2024-02-13T10:03:31.881171378Z" level=error msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\" failed" error="failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:03:31.881401 kubelet[2593]: E0213 10:03:31.881375 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768" Feb 13 10:03:31.881595 kubelet[2593]: E0213 10:03:31.881427 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768} Feb 13 10:03:31.881595 kubelet[2593]: E0213 10:03:31.881456 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:03:31.881595 kubelet[2593]: E0213 10:03:31.881480 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-86cd8c4979-2tlsw" podUID=18384425-4aba-475c-a64f-6bfe3101b275 Feb 13 10:03:33.243183 systemd[1]: Started sshd@36-139.178.70.43:22-139.178.68.195:50138.service. Feb 13 10:03:33.241000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@36-139.178.70.43:22-139.178.68.195:50138 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 10:03:33.270213 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 10:03:33.270272 kernel: audit: type=1130 audit(1707818613.241:1504): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@36-139.178.70.43:22-139.178.68.195:50138 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:03:33.379000 audit[8662]: USER_ACCT pid=8662 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:33.380818 sshd[8662]: Accepted publickey for core from 139.178.68.195 port 50138 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 10:03:33.381625 sshd[8662]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 10:03:33.384028 systemd-logind[1461]: New session 39 of user core. Feb 13 10:03:33.384625 systemd[1]: Started session-39.scope. Feb 13 10:03:33.462891 sshd[8662]: pam_unix(sshd:session): session closed for user core Feb 13 10:03:33.464217 systemd[1]: sshd@36-139.178.70.43:22-139.178.68.195:50138.service: Deactivated successfully. Feb 13 10:03:33.464644 systemd[1]: session-39.scope: Deactivated successfully. Feb 13 10:03:33.465011 systemd-logind[1461]: Session 39 logged out. Waiting for processes to exit. Feb 13 10:03:33.465396 systemd-logind[1461]: Removed session 39. Feb 13 10:03:33.380000 audit[8662]: CRED_ACQ pid=8662 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:33.563014 kernel: audit: type=1101 audit(1707818613.379:1505): pid=8662 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:33.563059 kernel: audit: type=1103 audit(1707818613.380:1506): pid=8662 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:33.563076 kernel: audit: type=1006 audit(1707818613.380:1507): pid=8662 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=39 res=1 Feb 13 10:03:33.621659 kernel: audit: type=1300 audit(1707818613.380:1507): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc2c99d220 a2=3 a3=0 items=0 ppid=1 pid=8662 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=39 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:03:33.380000 audit[8662]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc2c99d220 a2=3 a3=0 items=0 ppid=1 pid=8662 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=39 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:03:33.713730 kernel: audit: type=1327 audit(1707818613.380:1507): proctitle=737368643A20636F7265205B707269765D Feb 13 10:03:33.380000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 10:03:33.744260 kernel: 
audit: type=1105 audit(1707818613.385:1508): pid=8662 uid=0 auid=500 ses=39 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:33.385000 audit[8662]: USER_START pid=8662 uid=0 auid=500 ses=39 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:33.838827 kernel: audit: type=1103 audit(1707818613.385:1509): pid=8664 uid=0 auid=500 ses=39 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:33.385000 audit[8664]: CRED_ACQ pid=8664 uid=0 auid=500 ses=39 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:33.928206 kernel: audit: type=1106 audit(1707818613.462:1510): pid=8662 uid=0 auid=500 ses=39 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:33.462000 audit[8662]: USER_END pid=8662 uid=0 auid=500 ses=39 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:34.023784 kernel: audit: type=1104 audit(1707818613.462:1511): pid=8662 uid=0 auid=500 ses=39 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:33.462000 audit[8662]: CRED_DISP pid=8662 uid=0 auid=500 ses=39 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:33.463000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@36-139.178.70.43:22-139.178.68.195:50138 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 10:03:35.855018 env[1473]: time="2024-02-13T10:03:35.854885595Z" level=info msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\"" Feb 13 10:03:35.880717 env[1473]: time="2024-02-13T10:03:35.880617521Z" level=error msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\" failed" error="failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:03:35.880957 kubelet[2593]: E0213 10:03:35.880907 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f" Feb 13 10:03:35.881131 kubelet[2593]: E0213 10:03:35.880960 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f} Feb 13 10:03:35.881131 kubelet[2593]: E0213 10:03:35.880985 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:03:35.881131 kubelet[2593]: E0213 10:03:35.881003 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250 Feb 13 10:03:36.855179 env[1473]: time="2024-02-13T10:03:36.855098387Z" level=info msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\"" Feb 13 10:03:36.881252 env[1473]: time="2024-02-13T10:03:36.881189812Z" level=error msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\" failed" error="failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:03:36.881432 kubelet[2593]: E0213 10:03:36.881410 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox 
\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2" Feb 13 10:03:36.881575 kubelet[2593]: E0213 10:03:36.881437 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2} Feb 13 10:03:36.881575 kubelet[2593]: E0213 10:03:36.881457 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:03:36.881575 kubelet[2593]: E0213 10:03:36.881478 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-sv24x" podUID=fe6819ac-25fb-455a-b6b5-7432acf1219d Feb 13 10:03:38.473905 systemd[1]: Started sshd@37-139.178.70.43:22-139.178.68.195:52190.service. Feb 13 10:03:38.472000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@37-139.178.70.43:22-139.178.68.195:52190 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:03:38.501430 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 10:03:38.501528 kernel: audit: type=1130 audit(1707818618.472:1513): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@37-139.178.70.43:22-139.178.68.195:52190 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:03:38.610000 audit[8745]: USER_ACCT pid=8745 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:38.611656 sshd[8745]: Accepted publickey for core from 139.178.68.195 port 52190 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 10:03:38.613009 sshd[8745]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 10:03:38.615438 systemd-logind[1461]: New session 40 of user core. Feb 13 10:03:38.616037 systemd[1]: Started session-40.scope. 
Feb 13 10:03:38.611000 audit[8745]: CRED_ACQ pid=8745 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:38.794911 kernel: audit: type=1101 audit(1707818618.610:1514): pid=8745 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:38.794956 kernel: audit: type=1103 audit(1707818618.611:1515): pid=8745 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:38.794975 kernel: audit: type=1006 audit(1707818618.611:1516): pid=8745 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=40 res=1 Feb 13 10:03:38.611000 audit[8745]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc4b94b7d0 a2=3 a3=0 items=0 ppid=1 pid=8745 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=40 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:03:38.945555 kernel: audit: type=1300 audit(1707818618.611:1516): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc4b94b7d0 a2=3 a3=0 items=0 ppid=1 pid=8745 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=40 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:03:38.945636 kernel: audit: type=1327 audit(1707818618.611:1516): proctitle=737368643A20636F7265205B707269765D Feb 13 10:03:38.611000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 10:03:38.945834 sshd[8745]: pam_unix(sshd:session): session closed for user core Feb 13 10:03:38.947470 systemd[1]: sshd@37-139.178.70.43:22-139.178.68.195:52190.service: Deactivated successfully. Feb 13 10:03:38.948122 systemd[1]: session-40.scope: Deactivated successfully. Feb 13 10:03:38.948672 systemd-logind[1461]: Session 40 logged out. Waiting for processes to exit. Feb 13 10:03:38.949222 systemd-logind[1461]: Removed session 40. 
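By this point the journal shows the same four sandboxes (5a84d3a4..., de8210eb..., 2d12d1d6..., b3cffff3...) being retried roughly every twelve seconds, once per pod sync. A throwaway log-analysis sketch (not kubelet code; the regex is keyed to the exact msg= format of the containerd lines above) tallies the attempts when the journal text is piped to it on stdin:

```go
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

// Matches the containerd info lines that open each teardown attempt,
// e.g.: level=info msg="StopPodSandbox for \"de8210eb...\""
var stopSandbox = regexp.MustCompile(`level=info msg="StopPodSandbox for \\"([0-9a-f]{64})`)

func main() {
	counts := map[string]int{}
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // journal lines are long
	for sc.Scan() {
		if m := stopSandbox.FindStringSubmatch(sc.Text()); m != nil {
			counts[m[1]]++
		}
	}
	for id, n := range counts {
		fmt.Printf("%s... %d StopPodSandbox attempts\n", id[:12], n)
	}
}
```

Fed this stretch of the journal, it reports each sandbox ID with a climbing attempt count, which is the expected behaviour: the pod workers requeue the pod after every failed KillPodSandbox rather than giving up.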
Feb 13 10:03:38.976144 kernel: audit: type=1105 audit(1707818618.616:1517): pid=8745 uid=0 auid=500 ses=40 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:38.616000 audit[8745]: USER_START pid=8745 uid=0 auid=500 ses=40 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:39.070694 kernel: audit: type=1103 audit(1707818618.617:1518): pid=8747 uid=0 auid=500 ses=40 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:38.617000 audit[8747]: CRED_ACQ pid=8747 uid=0 auid=500 ses=40 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:39.159920 kernel: audit: type=1106 audit(1707818618.945:1519): pid=8745 uid=0 auid=500 ses=40 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:38.945000 audit[8745]: USER_END pid=8745 uid=0 auid=500 ses=40 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:39.255473 kernel: audit: type=1104 audit(1707818618.945:1520): pid=8745 uid=0 auid=500 ses=40 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:38.945000 audit[8745]: CRED_DISP pid=8745 uid=0 auid=500 ses=40 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:38.946000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@37-139.178.70.43:22-139.178.68.195:52190 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 10:03:43.854622 env[1473]: time="2024-02-13T10:03:43.854483484Z" level=info msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\"" Feb 13 10:03:43.880443 env[1473]: time="2024-02-13T10:03:43.880408001Z" level=error msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\" failed" error="failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:03:43.880641 kubelet[2593]: E0213 10:03:43.880603 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654" Feb 13 10:03:43.880641 kubelet[2593]: E0213 10:03:43.880629 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654} Feb 13 10:03:43.880989 kubelet[2593]: E0213 10:03:43.880654 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:03:43.880989 kubelet[2593]: E0213 10:03:43.880673 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-zxn6w" podUID=ac15c9fc-cc5d-4a8f-ac09-16f6497ee733 Feb 13 10:03:43.892309 systemd[1]: Started sshd@38-139.178.70.43:22-139.178.68.195:52204.service. Feb 13 10:03:43.891000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@38-139.178.70.43:22-139.178.68.195:52204 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:03:43.919085 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 10:03:43.919143 kernel: audit: type=1130 audit(1707818623.891:1522): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@38-139.178.70.43:22-139.178.68.195:52204 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 10:03:44.028000 audit[8803]: USER_ACCT pid=8803 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:44.029699 sshd[8803]: Accepted publickey for core from 139.178.68.195 port 52204 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 10:03:44.031642 sshd[8803]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 10:03:44.034034 systemd-logind[1461]: New session 41 of user core. Feb 13 10:03:44.034607 systemd[1]: Started session-41.scope. Feb 13 10:03:44.115848 sshd[8803]: pam_unix(sshd:session): session closed for user core Feb 13 10:03:44.117478 systemd[1]: sshd@38-139.178.70.43:22-139.178.68.195:52204.service: Deactivated successfully. Feb 13 10:03:44.118227 systemd[1]: session-41.scope: Deactivated successfully. Feb 13 10:03:44.118833 systemd-logind[1461]: Session 41 logged out. Waiting for processes to exit. Feb 13 10:03:44.119273 systemd-logind[1461]: Removed session 41. Feb 13 10:03:44.030000 audit[8803]: CRED_ACQ pid=8803 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:44.212417 kernel: audit: type=1101 audit(1707818624.028:1523): pid=8803 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:44.212452 kernel: audit: type=1103 audit(1707818624.030:1524): pid=8803 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:44.212469 kernel: audit: type=1006 audit(1707818624.030:1525): pid=8803 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=41 res=1 Feb 13 10:03:44.270941 kernel: audit: type=1300 audit(1707818624.030:1525): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff5cdb7d50 a2=3 a3=0 items=0 ppid=1 pid=8803 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=41 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:03:44.030000 audit[8803]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff5cdb7d50 a2=3 a3=0 items=0 ppid=1 pid=8803 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=41 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:03:44.030000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 10:03:44.393501 kernel: audit: type=1327 audit(1707818624.030:1525): proctitle=737368643A20636F7265205B707269765D Feb 13 10:03:44.393536 kernel: audit: type=1105 audit(1707818624.035:1526): pid=8803 uid=0 auid=500 ses=41 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:44.035000 
audit[8803]: USER_START pid=8803 uid=0 auid=500 ses=41 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:44.035000 audit[8805]: CRED_ACQ pid=8805 uid=0 auid=500 ses=41 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:44.577418 kernel: audit: type=1103 audit(1707818624.035:1527): pid=8805 uid=0 auid=500 ses=41 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:44.577450 kernel: audit: type=1106 audit(1707818624.115:1528): pid=8803 uid=0 auid=500 ses=41 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:44.115000 audit[8803]: USER_END pid=8803 uid=0 auid=500 ses=41 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:44.672934 kernel: audit: type=1104 audit(1707818624.115:1529): pid=8803 uid=0 auid=500 ses=41 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:44.115000 audit[8803]: CRED_DISP pid=8803 uid=0 auid=500 ses=41 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:44.116000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@38-139.178.70.43:22-139.178.68.195:52204 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 10:03:44.854819 env[1473]: time="2024-02-13T10:03:44.854732249Z" level=info msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\"" Feb 13 10:03:44.880879 env[1473]: time="2024-02-13T10:03:44.880840999Z" level=error msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\" failed" error="failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:03:44.881064 kubelet[2593]: E0213 10:03:44.881040 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768" Feb 13 10:03:44.881225 kubelet[2593]: E0213 10:03:44.881090 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768} Feb 13 10:03:44.881225 kubelet[2593]: E0213 10:03:44.881111 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:03:44.881225 kubelet[2593]: E0213 10:03:44.881126 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-86cd8c4979-2tlsw" podUID=18384425-4aba-475c-a64f-6bfe3101b275 Feb 13 10:03:47.854573 env[1473]: time="2024-02-13T10:03:47.854452437Z" level=info msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\"" Feb 13 10:03:47.906785 env[1473]: time="2024-02-13T10:03:47.906679309Z" level=error msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\" failed" error="failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:03:47.907024 kubelet[2593]: E0213 10:03:47.906965 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox 
\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f" Feb 13 10:03:47.907024 kubelet[2593]: E0213 10:03:47.907008 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f} Feb 13 10:03:47.907477 kubelet[2593]: E0213 10:03:47.907058 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:03:47.907477 kubelet[2593]: E0213 10:03:47.907096 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250 Feb 13 10:03:49.125473 systemd[1]: Started sshd@39-139.178.70.43:22-139.178.68.195:48790.service. Feb 13 10:03:49.124000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@39-139.178.70.43:22-139.178.68.195:48790 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:03:49.152503 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 10:03:49.152566 kernel: audit: type=1130 audit(1707818629.124:1531): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@39-139.178.70.43:22-139.178.68.195:48790 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:03:49.261000 audit[8886]: USER_ACCT pid=8886 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:49.262792 sshd[8886]: Accepted publickey for core from 139.178.68.195 port 48790 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 10:03:49.264645 sshd[8886]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 10:03:49.267063 systemd-logind[1461]: New session 42 of user core. Feb 13 10:03:49.267620 systemd[1]: Started session-42.scope. Feb 13 10:03:49.346991 sshd[8886]: pam_unix(sshd:session): session closed for user core Feb 13 10:03:49.348314 systemd[1]: sshd@39-139.178.70.43:22-139.178.68.195:48790.service: Deactivated successfully. Feb 13 10:03:49.348770 systemd[1]: session-42.scope: Deactivated successfully. Feb 13 10:03:49.349097 systemd-logind[1461]: Session 42 logged out. 
Waiting for processes to exit. Feb 13 10:03:49.349842 systemd-logind[1461]: Removed session 42. Feb 13 10:03:49.263000 audit[8886]: CRED_ACQ pid=8886 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:49.444512 kernel: audit: type=1101 audit(1707818629.261:1532): pid=8886 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:49.444585 kernel: audit: type=1103 audit(1707818629.263:1533): pid=8886 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:49.444626 kernel: audit: type=1006 audit(1707818629.263:1534): pid=8886 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=42 res=1 Feb 13 10:03:49.263000 audit[8886]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc8e0c39b0 a2=3 a3=0 items=0 ppid=1 pid=8886 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=42 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:03:49.595041 kernel: audit: type=1300 audit(1707818629.263:1534): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc8e0c39b0 a2=3 a3=0 items=0 ppid=1 pid=8886 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=42 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:03:49.263000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 10:03:49.595344 kernel: audit: type=1327 audit(1707818629.263:1534): proctitle=737368643A20636F7265205B707269765D Feb 13 10:03:49.268000 audit[8886]: USER_START pid=8886 uid=0 auid=500 ses=42 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:49.720057 kernel: audit: type=1105 audit(1707818629.268:1535): pid=8886 uid=0 auid=500 ses=42 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:49.269000 audit[8888]: CRED_ACQ pid=8888 uid=0 auid=500 ses=42 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:49.809291 kernel: audit: type=1103 audit(1707818629.269:1536): pid=8888 uid=0 auid=500 ses=42 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:49.809342 kernel: audit: type=1106 audit(1707818629.346:1537): pid=8886 uid=0 auid=500 ses=42 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close 
grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:49.346000 audit[8886]: USER_END pid=8886 uid=0 auid=500 ses=42 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:49.346000 audit[8886]: CRED_DISP pid=8886 uid=0 auid=500 ses=42 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:49.994117 kernel: audit: type=1104 audit(1707818629.346:1538): pid=8886 uid=0 auid=500 ses=42 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:49.347000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@39-139.178.70.43:22-139.178.68.195:48790 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:03:50.854859 env[1473]: time="2024-02-13T10:03:50.854733878Z" level=info msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\"" Feb 13 10:03:50.912717 env[1473]: time="2024-02-13T10:03:50.912566986Z" level=error msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\" failed" error="failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:03:50.913080 kubelet[2593]: E0213 10:03:50.913006 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2" Feb 13 10:03:50.913847 kubelet[2593]: E0213 10:03:50.913087 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2} Feb 13 10:03:50.913847 kubelet[2593]: E0213 10:03:50.913183 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:03:50.913847 kubelet[2593]: E0213 10:03:50.913258 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for 
\"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-sv24x" podUID=fe6819ac-25fb-455a-b6b5-7432acf1219d Feb 13 10:03:54.356605 systemd[1]: Started sshd@40-139.178.70.43:22-139.178.68.195:48792.service. Feb 13 10:03:54.355000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@40-139.178.70.43:22-139.178.68.195:48792 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:03:54.385975 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 10:03:54.386087 kernel: audit: type=1130 audit(1707818634.355:1540): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@40-139.178.70.43:22-139.178.68.195:48792 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:03:54.492000 audit[8945]: USER_ACCT pid=8945 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:54.494210 sshd[8945]: Accepted publickey for core from 139.178.68.195 port 48792 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 10:03:54.495660 sshd[8945]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 10:03:54.498001 systemd-logind[1461]: New session 43 of user core. Feb 13 10:03:54.498495 systemd[1]: Started session-43.scope. Feb 13 10:03:54.583352 sshd[8945]: pam_unix(sshd:session): session closed for user core Feb 13 10:03:54.584570 systemd[1]: sshd@40-139.178.70.43:22-139.178.68.195:48792.service: Deactivated successfully. Feb 13 10:03:54.584994 systemd[1]: session-43.scope: Deactivated successfully. Feb 13 10:03:54.585254 systemd-logind[1461]: Session 43 logged out. Waiting for processes to exit. Feb 13 10:03:54.494000 audit[8945]: CRED_ACQ pid=8945 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:54.585755 systemd-logind[1461]: Removed session 43. 
Feb 13 10:03:54.675830 kernel: audit: type=1101 audit(1707818634.492:1541): pid=8945 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:54.675920 kernel: audit: type=1103 audit(1707818634.494:1542): pid=8945 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:54.675938 kernel: audit: type=1006 audit(1707818634.494:1543): pid=8945 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=43 res=1 Feb 13 10:03:54.734416 kernel: audit: type=1300 audit(1707818634.494:1543): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff7595eb70 a2=3 a3=0 items=0 ppid=1 pid=8945 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=43 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:03:54.494000 audit[8945]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff7595eb70 a2=3 a3=0 items=0 ppid=1 pid=8945 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=43 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:03:54.494000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 10:03:54.856835 kernel: audit: type=1327 audit(1707818634.494:1543): proctitle=737368643A20636F7265205B707269765D Feb 13 10:03:54.856865 kernel: audit: type=1105 audit(1707818634.499:1544): pid=8945 uid=0 auid=500 ses=43 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:54.499000 audit[8945]: USER_START pid=8945 uid=0 auid=500 ses=43 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:54.500000 audit[8947]: CRED_ACQ pid=8947 uid=0 auid=500 ses=43 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:55.040543 kernel: audit: type=1103 audit(1707818634.500:1545): pid=8947 uid=0 auid=500 ses=43 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:55.040626 kernel: audit: type=1106 audit(1707818634.582:1546): pid=8945 uid=0 auid=500 ses=43 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:54.582000 audit[8945]: USER_END pid=8945 uid=0 auid=500 ses=43 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close 
grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:54.582000 audit[8945]: CRED_DISP pid=8945 uid=0 auid=500 ses=43 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:55.225517 kernel: audit: type=1104 audit(1707818634.582:1547): pid=8945 uid=0 auid=500 ses=43 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:54.583000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@40-139.178.70.43:22-139.178.68.195:48792 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:03:55.209000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:03:55.209000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c0031364a0 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 10:03:55.209000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 10:03:55.209000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:03:55.209000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c003124930 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 10:03:55.209000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 10:03:55.437000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:03:55.437000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=67 a1=c001225c60 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" 
subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 10:03:55.437000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 10:03:55.437000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:03:55.437000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=67 a1=c00b7148a0 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 10:03:55.437000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 10:03:55.440000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:03:55.440000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=67 a1=c0133509f0 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 10:03:55.440000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 10:03:55.441000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:03:55.441000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=67 a1=c0005ec3c0 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 10:03:55.441000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 10:03:55.441000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-client.crt" dev="sdb9" ino=525079 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:03:55.441000 audit[2425]: AVC 
avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/apiserver.crt" dev="sdb9" ino=525073 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:03:55.441000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=67 a1=c00d998d50 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 10:03:55.441000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 10:03:55.441000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=68 a1=c00b714930 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 10:03:55.441000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 10:03:55.854976 env[1473]: time="2024-02-13T10:03:55.854754959Z" level=info msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\"" Feb 13 10:03:55.880365 env[1473]: time="2024-02-13T10:03:55.880290621Z" level=error msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\" failed" error="failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:03:55.880526 kubelet[2593]: E0213 10:03:55.880477 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654" Feb 13 10:03:55.880526 kubelet[2593]: E0213 10:03:55.880504 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654} Feb 13 10:03:55.880526 kubelet[2593]: E0213 10:03:55.880526 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/\"" Feb 13 10:03:55.880766 kubelet[2593]: E0213 10:03:55.880545 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-zxn6w" podUID=ac15c9fc-cc5d-4a8f-ac09-16f6497ee733 Feb 13 10:03:57.854456 env[1473]: time="2024-02-13T10:03:57.854356555Z" level=info msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\"" Feb 13 10:03:57.880648 env[1473]: time="2024-02-13T10:03:57.880587549Z" level=error msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\" failed" error="failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:03:57.880763 kubelet[2593]: E0213 10:03:57.880738 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768" Feb 13 10:03:57.880911 kubelet[2593]: E0213 10:03:57.880764 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768} Feb 13 10:03:57.880911 kubelet[2593]: E0213 10:03:57.880785 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:03:57.880911 kubelet[2593]: E0213 10:03:57.880803 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-86cd8c4979-2tlsw" podUID=18384425-4aba-475c-a64f-6bfe3101b275 Feb 13 10:03:59.592936 systemd[1]: Started sshd@41-139.178.70.43:22-139.178.68.195:50482.service. 
Feb 13 10:03:59.591000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@41-139.178.70.43:22-139.178.68.195:50482 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:03:59.620082 kernel: kauditd_printk_skb: 25 callbacks suppressed Feb 13 10:03:59.620159 kernel: audit: type=1130 audit(1707818639.591:1557): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@41-139.178.70.43:22-139.178.68.195:50482 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:03:59.728000 audit[9029]: USER_ACCT pid=9029 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:59.729829 sshd[9029]: Accepted publickey for core from 139.178.68.195 port 50482 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 10:03:59.731640 sshd[9029]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 10:03:59.733977 systemd-logind[1461]: New session 44 of user core. Feb 13 10:03:59.734493 systemd[1]: Started session-44.scope. Feb 13 10:03:59.813254 sshd[9029]: pam_unix(sshd:session): session closed for user core Feb 13 10:03:59.814635 systemd[1]: sshd@41-139.178.70.43:22-139.178.68.195:50482.service: Deactivated successfully. Feb 13 10:03:59.815054 systemd[1]: session-44.scope: Deactivated successfully. Feb 13 10:03:59.815328 systemd-logind[1461]: Session 44 logged out. Waiting for processes to exit. Feb 13 10:03:59.815812 systemd-logind[1461]: Removed session 44. 
Feb 13 10:03:59.730000 audit[9029]: CRED_ACQ pid=9029 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:59.912556 kernel: audit: type=1101 audit(1707818639.728:1558): pid=9029 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:59.912603 kernel: audit: type=1103 audit(1707818639.730:1559): pid=9029 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:59.912620 kernel: audit: type=1006 audit(1707818639.730:1560): pid=9029 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=44 res=1 Feb 13 10:03:59.971254 kernel: audit: type=1300 audit(1707818639.730:1560): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff50637740 a2=3 a3=0 items=0 ppid=1 pid=9029 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=44 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:03:59.730000 audit[9029]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff50637740 a2=3 a3=0 items=0 ppid=1 pid=9029 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=44 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:04:00.063261 kernel: audit: type=1327 audit(1707818639.730:1560): proctitle=737368643A20636F7265205B707269765D Feb 13 10:03:59.730000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 10:04:00.093756 kernel: audit: type=1105 audit(1707818639.735:1561): pid=9029 uid=0 auid=500 ses=44 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:59.735000 audit[9029]: USER_START pid=9029 uid=0 auid=500 ses=44 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:00.188231 kernel: audit: type=1103 audit(1707818639.735:1562): pid=9031 uid=0 auid=500 ses=44 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:59.735000 audit[9031]: CRED_ACQ pid=9031 uid=0 auid=500 ses=44 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:00.277424 kernel: audit: type=1106 audit(1707818639.812:1563): pid=9029 uid=0 auid=500 ses=44 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail 
acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:59.812000 audit[9029]: USER_END pid=9029 uid=0 auid=500 ses=44 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:00.372970 kernel: audit: type=1104 audit(1707818639.812:1564): pid=9029 uid=0 auid=500 ses=44 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:59.812000 audit[9029]: CRED_DISP pid=9029 uid=0 auid=500 ses=44 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:03:59.813000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@41-139.178.70.43:22-139.178.68.195:50482 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:04:00.854215 env[1473]: time="2024-02-13T10:04:00.854077597Z" level=info msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\"" Feb 13 10:04:00.879026 env[1473]: time="2024-02-13T10:04:00.878959415Z" level=error msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\" failed" error="failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:04:00.879207 kubelet[2593]: E0213 10:04:00.879197 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f" Feb 13 10:04:00.879426 kubelet[2593]: E0213 10:04:00.879223 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f} Feb 13 10:04:00.879426 kubelet[2593]: E0213 10:04:00.879244 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:04:00.879426 kubelet[2593]: E0213 10:04:00.879262 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for 
sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250 Feb 13 10:04:02.858704 env[1473]: time="2024-02-13T10:04:02.858612964Z" level=info msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\"" Feb 13 10:04:02.872829 env[1473]: time="2024-02-13T10:04:02.872765184Z" level=error msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\" failed" error="failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:04:02.872942 kubelet[2593]: E0213 10:04:02.872927 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2" Feb 13 10:04:02.873117 kubelet[2593]: E0213 10:04:02.872957 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2} Feb 13 10:04:02.873117 kubelet[2593]: E0213 10:04:02.872983 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:04:02.873117 kubelet[2593]: E0213 10:04:02.873003 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-sv24x" podUID=fe6819ac-25fb-455a-b6b5-7432acf1219d Feb 13 10:04:04.822372 systemd[1]: Started sshd@42-139.178.70.43:22-139.178.68.195:50496.service. Feb 13 10:04:04.821000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@42-139.178.70.43:22-139.178.68.195:50496 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 10:04:04.849193 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 10:04:04.849263 kernel: audit: type=1130 audit(1707818644.821:1566): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@42-139.178.70.43:22-139.178.68.195:50496 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:04:04.957000 audit[9110]: USER_ACCT pid=9110 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:04.958607 sshd[9110]: Accepted publickey for core from 139.178.68.195 port 50496 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 10:04:04.960614 sshd[9110]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 10:04:04.963025 systemd-logind[1461]: New session 45 of user core. Feb 13 10:04:04.963521 systemd[1]: Started session-45.scope. Feb 13 10:04:05.045044 sshd[9110]: pam_unix(sshd:session): session closed for user core Feb 13 10:04:05.046523 systemd[1]: sshd@42-139.178.70.43:22-139.178.68.195:50496.service: Deactivated successfully. Feb 13 10:04:05.047014 systemd[1]: session-45.scope: Deactivated successfully. Feb 13 10:04:05.047345 systemd-logind[1461]: Session 45 logged out. Waiting for processes to exit. Feb 13 10:04:05.047888 systemd-logind[1461]: Removed session 45. Feb 13 10:04:04.959000 audit[9110]: CRED_ACQ pid=9110 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:05.143107 kernel: audit: type=1101 audit(1707818644.957:1567): pid=9110 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:05.143142 kernel: audit: type=1103 audit(1707818644.959:1568): pid=9110 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:05.143159 kernel: audit: type=1006 audit(1707818644.959:1569): pid=9110 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=45 res=1 Feb 13 10:04:05.201651 kernel: audit: type=1300 audit(1707818644.959:1569): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff6d4c7c50 a2=3 a3=0 items=0 ppid=1 pid=9110 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=45 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:04:04.959000 audit[9110]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff6d4c7c50 a2=3 a3=0 items=0 ppid=1 pid=9110 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=45 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:04:05.293634 kernel: audit: type=1327 audit(1707818644.959:1569): proctitle=737368643A20636F7265205B707269765D Feb 13 10:04:04.959000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 10:04:05.324155 kernel: 
audit: type=1105 audit(1707818644.964:1570): pid=9110 uid=0 auid=500 ses=45 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:04.964000 audit[9110]: USER_START pid=9110 uid=0 auid=500 ses=45 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:05.418636 kernel: audit: type=1103 audit(1707818644.964:1571): pid=9112 uid=0 auid=500 ses=45 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:04.964000 audit[9112]: CRED_ACQ pid=9112 uid=0 auid=500 ses=45 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:05.507881 kernel: audit: type=1106 audit(1707818645.044:1572): pid=9110 uid=0 auid=500 ses=45 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:05.044000 audit[9110]: USER_END pid=9110 uid=0 auid=500 ses=45 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:05.603348 kernel: audit: type=1104 audit(1707818645.044:1573): pid=9110 uid=0 auid=500 ses=45 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:05.044000 audit[9110]: CRED_DISP pid=9110 uid=0 auid=500 ses=45 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:05.045000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@42-139.178.70.43:22-139.178.68.195:50496 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 10:04:07.854999 env[1473]: time="2024-02-13T10:04:07.854909318Z" level=info msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\"" Feb 13 10:04:07.906368 env[1473]: time="2024-02-13T10:04:07.906294288Z" level=error msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\" failed" error="failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:04:07.906625 kubelet[2593]: E0213 10:04:07.906565 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654" Feb 13 10:04:07.906625 kubelet[2593]: E0213 10:04:07.906616 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654} Feb 13 10:04:07.907086 kubelet[2593]: E0213 10:04:07.906672 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:04:07.907086 kubelet[2593]: E0213 10:04:07.906712 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-zxn6w" podUID=ac15c9fc-cc5d-4a8f-ac09-16f6497ee733 Feb 13 10:04:09.615000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:04:09.615000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c002e8fca0 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 10:04:09.615000 audit: PROCTITLE 
proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 10:04:09.617000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:04:09.617000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c002e8fcc0 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 10:04:09.617000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 10:04:09.618000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:04:09.618000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c003136f80 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 10:04:09.618000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 10:04:09.618000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:04:09.618000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=c a1=c001f6a820 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 10:04:09.618000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 10:04:09.854230 env[1473]: time="2024-02-13T10:04:09.854102923Z" level=info msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\"" Feb 13 10:04:09.881168 env[1473]: time="2024-02-13T10:04:09.881037161Z" level=error msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\" failed" error="failed to destroy network for sandbox 
\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:04:09.881317 kubelet[2593]: E0213 10:04:09.881301 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768" Feb 13 10:04:09.881526 kubelet[2593]: E0213 10:04:09.881348 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768} Feb 13 10:04:09.881526 kubelet[2593]: E0213 10:04:09.881397 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:04:09.881526 kubelet[2593]: E0213 10:04:09.881415 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-86cd8c4979-2tlsw" podUID=18384425-4aba-475c-a64f-6bfe3101b275 Feb 13 10:04:10.055161 systemd[1]: Started sshd@43-139.178.70.43:22-139.178.68.195:55638.service. Feb 13 10:04:10.053000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@43-139.178.70.43:22-139.178.68.195:55638 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:04:10.082000 kernel: kauditd_printk_skb: 13 callbacks suppressed Feb 13 10:04:10.082053 kernel: audit: type=1130 audit(1707818650.053:1579): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@43-139.178.70.43:22-139.178.68.195:55638 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 10:04:10.191000 audit[9193]: USER_ACCT pid=9193 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:10.192548 sshd[9193]: Accepted publickey for core from 139.178.68.195 port 55638 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 10:04:10.193654 sshd[9193]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 10:04:10.195971 systemd-logind[1461]: New session 46 of user core. Feb 13 10:04:10.196538 systemd[1]: Started session-46.scope. Feb 13 10:04:10.277514 sshd[9193]: pam_unix(sshd:session): session closed for user core Feb 13 10:04:10.279061 systemd[1]: sshd@43-139.178.70.43:22-139.178.68.195:55638.service: Deactivated successfully. Feb 13 10:04:10.279526 systemd[1]: session-46.scope: Deactivated successfully. Feb 13 10:04:10.280004 systemd-logind[1461]: Session 46 logged out. Waiting for processes to exit. Feb 13 10:04:10.280436 systemd-logind[1461]: Removed session 46. Feb 13 10:04:10.192000 audit[9193]: CRED_ACQ pid=9193 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:10.376429 kernel: audit: type=1101 audit(1707818650.191:1580): pid=9193 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:10.376466 kernel: audit: type=1103 audit(1707818650.192:1581): pid=9193 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:10.376483 kernel: audit: type=1006 audit(1707818650.192:1582): pid=9193 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=46 res=1 Feb 13 10:04:10.435017 kernel: audit: type=1300 audit(1707818650.192:1582): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc84083ba0 a2=3 a3=0 items=0 ppid=1 pid=9193 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=46 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:04:10.192000 audit[9193]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc84083ba0 a2=3 a3=0 items=0 ppid=1 pid=9193 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=46 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:04:10.527030 kernel: audit: type=1327 audit(1707818650.192:1582): proctitle=737368643A20636F7265205B707269765D Feb 13 10:04:10.192000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 10:04:10.197000 audit[9193]: USER_START pid=9193 uid=0 auid=500 ses=46 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:10.652048 kernel: audit: type=1105 
audit(1707818650.197:1583): pid=9193 uid=0 auid=500 ses=46 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:10.652078 kernel: audit: type=1103 audit(1707818650.197:1584): pid=9195 uid=0 auid=500 ses=46 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:10.197000 audit[9195]: CRED_ACQ pid=9195 uid=0 auid=500 ses=46 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:10.741328 kernel: audit: type=1106 audit(1707818650.276:1585): pid=9193 uid=0 auid=500 ses=46 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:10.276000 audit[9193]: USER_END pid=9193 uid=0 auid=500 ses=46 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:10.836865 kernel: audit: type=1104 audit(1707818650.276:1586): pid=9193 uid=0 auid=500 ses=46 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:10.276000 audit[9193]: CRED_DISP pid=9193 uid=0 auid=500 ses=46 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:10.276000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@43-139.178.70.43:22-139.178.68.195:55638 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:04:15.286795 systemd[1]: Started sshd@44-139.178.70.43:22-139.178.68.195:55642.service. Feb 13 10:04:15.285000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@44-139.178.70.43:22-139.178.68.195:55642 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:04:15.313909 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 10:04:15.313982 kernel: audit: type=1130 audit(1707818655.285:1588): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@44-139.178.70.43:22-139.178.68.195:55642 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 10:04:15.424000 audit[9219]: USER_ACCT pid=9219 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:15.425605 sshd[9219]: Accepted publickey for core from 139.178.68.195 port 55642 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 10:04:15.427631 sshd[9219]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 10:04:15.430097 systemd-logind[1461]: New session 47 of user core. Feb 13 10:04:15.430651 systemd[1]: Started session-47.scope. Feb 13 10:04:15.508917 sshd[9219]: pam_unix(sshd:session): session closed for user core Feb 13 10:04:15.510408 systemd[1]: sshd@44-139.178.70.43:22-139.178.68.195:55642.service: Deactivated successfully. Feb 13 10:04:15.510825 systemd[1]: session-47.scope: Deactivated successfully. Feb 13 10:04:15.511195 systemd-logind[1461]: Session 47 logged out. Waiting for processes to exit. Feb 13 10:04:15.511831 systemd-logind[1461]: Removed session 47. Feb 13 10:04:15.426000 audit[9219]: CRED_ACQ pid=9219 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:15.608492 kernel: audit: type=1101 audit(1707818655.424:1589): pid=9219 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:15.608531 kernel: audit: type=1103 audit(1707818655.426:1590): pid=9219 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:15.608548 kernel: audit: type=1006 audit(1707818655.426:1591): pid=9219 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=47 res=1 Feb 13 10:04:15.667100 kernel: audit: type=1300 audit(1707818655.426:1591): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe25d897a0 a2=3 a3=0 items=0 ppid=1 pid=9219 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=47 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:04:15.426000 audit[9219]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe25d897a0 a2=3 a3=0 items=0 ppid=1 pid=9219 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=47 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:04:15.759114 kernel: audit: type=1327 audit(1707818655.426:1591): proctitle=737368643A20636F7265205B707269765D Feb 13 10:04:15.426000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 10:04:15.431000 audit[9219]: USER_START pid=9219 uid=0 auid=500 ses=47 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:15.853086 env[1473]: 
time="2024-02-13T10:04:15.853066203Z" level=info msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\"" Feb 13 10:04:15.864815 env[1473]: time="2024-02-13T10:04:15.864779821Z" level=error msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\" failed" error="failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:04:15.864965 kubelet[2593]: E0213 10:04:15.864954 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f" Feb 13 10:04:15.865139 kubelet[2593]: E0213 10:04:15.864982 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f} Feb 13 10:04:15.865139 kubelet[2593]: E0213 10:04:15.865005 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:04:15.865139 kubelet[2593]: E0213 10:04:15.865022 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250 Feb 13 10:04:15.884215 kernel: audit: type=1105 audit(1707818655.431:1592): pid=9219 uid=0 auid=500 ses=47 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:15.884252 kernel: audit: type=1103 audit(1707818655.432:1593): pid=9221 uid=0 auid=500 ses=47 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:15.432000 audit[9221]: CRED_ACQ pid=9221 uid=0 auid=500 ses=47 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:15.973604 
kernel: audit: type=1106 audit(1707818655.508:1594): pid=9219 uid=0 auid=500 ses=47 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:15.508000 audit[9219]: USER_END pid=9219 uid=0 auid=500 ses=47 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:16.069170 kernel: audit: type=1104 audit(1707818655.508:1595): pid=9219 uid=0 auid=500 ses=47 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:15.508000 audit[9219]: CRED_DISP pid=9219 uid=0 auid=500 ses=47 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:15.509000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@44-139.178.70.43:22-139.178.68.195:55642 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:04:16.854457 env[1473]: time="2024-02-13T10:04:16.854372733Z" level=info msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\"" Feb 13 10:04:16.880350 env[1473]: time="2024-02-13T10:04:16.880290427Z" level=error msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\" failed" error="failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:04:16.880657 kubelet[2593]: E0213 10:04:16.880616 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2" Feb 13 10:04:16.880657 kubelet[2593]: E0213 10:04:16.880644 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2} Feb 13 10:04:16.880834 kubelet[2593]: E0213 10:04:16.880666 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:04:16.880834 kubelet[2593]: 
E0213 10:04:16.880683 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-sv24x" podUID=fe6819ac-25fb-455a-b6b5-7432acf1219d Feb 13 10:04:20.520272 systemd[1]: Started sshd@45-139.178.70.43:22-139.178.68.195:49264.service. Feb 13 10:04:20.519000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@45-139.178.70.43:22-139.178.68.195:49264 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:04:20.547698 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 10:04:20.547784 kernel: audit: type=1130 audit(1707818660.519:1597): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@45-139.178.70.43:22-139.178.68.195:49264 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:04:20.656000 audit[9300]: USER_ACCT pid=9300 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:20.657716 sshd[9300]: Accepted publickey for core from 139.178.68.195 port 49264 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 10:04:20.658644 sshd[9300]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 10:04:20.661026 systemd-logind[1461]: New session 48 of user core. Feb 13 10:04:20.661564 systemd[1]: Started session-48.scope. Feb 13 10:04:20.741477 sshd[9300]: pam_unix(sshd:session): session closed for user core Feb 13 10:04:20.742940 systemd[1]: sshd@45-139.178.70.43:22-139.178.68.195:49264.service: Deactivated successfully. Feb 13 10:04:20.743362 systemd[1]: session-48.scope: Deactivated successfully. Feb 13 10:04:20.743894 systemd-logind[1461]: Session 48 logged out. Waiting for processes to exit. Feb 13 10:04:20.744784 systemd-logind[1461]: Removed session 48. 
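The repeated StopPodSandbox failures above all trace back to one precondition: the Calico CNI plugin refuses to tear down a sandbox's network unless /var/lib/calico/nodename exists, and that file is only present while the calico/node container is running with /var/lib/calico/ mounted from the host. A minimal sketch of that precondition, written in Python purely for illustration (the real plugin is Go and does considerably more):

import os

# Path taken verbatim from the error messages in this log.
NODENAME_FILE = "/var/lib/calico/nodename"

def calico_delete_would_fail() -> bool:
    # calico/node writes this file on startup; if the container is down
    # (or /var/lib/calico/ is not mounted), the stat fails and the CNI
    # delete aborts -- the loop kubelet is stuck in above.
    return not os.path.exists(NODENAME_FILE)

if calico_delete_would_fail():
    print("delete would fail: stat /var/lib/calico/nodename: "
          "no such file or directory")

Because kubelet retries KillPodSandbox on every pod sync, the identical error recurs for each affected pod (csi-node-driver-w8xgk, coredns-787d4945fb-sv24x, coredns-787d4945fb-zxn6w, calico-kube-controllers-86cd8c4979-2tlsw) until calico/node comes back.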
Feb 13 10:04:20.657000 audit[9300]: CRED_ACQ pid=9300 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:20.749421 kernel: audit: type=1101 audit(1707818660.656:1598): pid=9300 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:20.749468 kernel: audit: type=1103 audit(1707818660.657:1599): pid=9300 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:20.853843 env[1473]: time="2024-02-13T10:04:20.853824669Z" level=info msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\"" Feb 13 10:04:20.865454 env[1473]: time="2024-02-13T10:04:20.865369783Z" level=error msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\" failed" error="failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:04:20.865565 kubelet[2593]: E0213 10:04:20.865516 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654" Feb 13 10:04:20.865565 kubelet[2593]: E0213 10:04:20.865545 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654} Feb 13 10:04:20.865750 kubelet[2593]: E0213 10:04:20.865568 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:04:20.865750 kubelet[2593]: E0213 10:04:20.865587 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-zxn6w" 
podUID=ac15c9fc-cc5d-4a8f-ac09-16f6497ee733 Feb 13 10:04:20.898158 kernel: audit: type=1006 audit(1707818660.657:1600): pid=9300 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=48 res=1 Feb 13 10:04:20.898195 kernel: audit: type=1300 audit(1707818660.657:1600): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffcdd8285f0 a2=3 a3=0 items=0 ppid=1 pid=9300 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=48 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:04:20.657000 audit[9300]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffcdd8285f0 a2=3 a3=0 items=0 ppid=1 pid=9300 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=48 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:04:20.990342 kernel: audit: type=1327 audit(1707818660.657:1600): proctitle=737368643A20636F7265205B707269765D Feb 13 10:04:20.657000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 10:04:21.020882 kernel: audit: type=1105 audit(1707818660.662:1601): pid=9300 uid=0 auid=500 ses=48 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:20.662000 audit[9300]: USER_START pid=9300 uid=0 auid=500 ses=48 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:21.115455 kernel: audit: type=1103 audit(1707818660.663:1602): pid=9302 uid=0 auid=500 ses=48 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:20.663000 audit[9302]: CRED_ACQ pid=9302 uid=0 auid=500 ses=48 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:20.740000 audit[9300]: USER_END pid=9300 uid=0 auid=500 ses=48 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:21.300428 kernel: audit: type=1106 audit(1707818660.740:1603): pid=9300 uid=0 auid=500 ses=48 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:21.300506 kernel: audit: type=1104 audit(1707818660.740:1604): pid=9300 uid=0 auid=500 ses=48 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:20.740000 audit[9300]: CRED_DISP pid=9300 uid=0 auid=500 ses=48 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred 
grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:20.741000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@45-139.178.70.43:22-139.178.68.195:49264 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:04:22.854116 env[1473]: time="2024-02-13T10:04:22.854013769Z" level=info msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\"" Feb 13 10:04:22.880521 env[1473]: time="2024-02-13T10:04:22.880454975Z" level=error msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\" failed" error="failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:04:22.881061 kubelet[2593]: E0213 10:04:22.880672 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768" Feb 13 10:04:22.881061 kubelet[2593]: E0213 10:04:22.880716 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768} Feb 13 10:04:22.881061 kubelet[2593]: E0213 10:04:22.880740 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:04:22.881061 kubelet[2593]: E0213 10:04:22.880757 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-86cd8c4979-2tlsw" podUID=18384425-4aba-475c-a64f-6bfe3101b275 Feb 13 10:04:25.751139 systemd[1]: Started sshd@46-139.178.70.43:22-139.178.68.195:49276.service. Feb 13 10:04:25.749000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@46-139.178.70.43:22-139.178.68.195:49276 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 10:04:25.777607 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 10:04:25.777655 kernel: audit: type=1130 audit(1707818665.749:1606): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@46-139.178.70.43:22-139.178.68.195:49276 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:04:25.887000 audit[9381]: USER_ACCT pid=9381 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:25.888796 sshd[9381]: Accepted publickey for core from 139.178.68.195 port 49276 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 10:04:25.890622 sshd[9381]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 10:04:25.892810 systemd-logind[1461]: New session 49 of user core. Feb 13 10:04:25.893229 systemd[1]: Started session-49.scope. Feb 13 10:04:25.972404 sshd[9381]: pam_unix(sshd:session): session closed for user core Feb 13 10:04:25.973900 systemd[1]: sshd@46-139.178.70.43:22-139.178.68.195:49276.service: Deactivated successfully. Feb 13 10:04:25.974329 systemd[1]: session-49.scope: Deactivated successfully. Feb 13 10:04:25.974742 systemd-logind[1461]: Session 49 logged out. Waiting for processes to exit. Feb 13 10:04:25.975238 systemd-logind[1461]: Removed session 49. Feb 13 10:04:25.889000 audit[9381]: CRED_ACQ pid=9381 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:26.070901 kernel: audit: type=1101 audit(1707818665.887:1607): pid=9381 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:26.070940 kernel: audit: type=1103 audit(1707818665.889:1608): pid=9381 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:26.070959 kernel: audit: type=1006 audit(1707818665.889:1609): pid=9381 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=49 res=1 Feb 13 10:04:26.129553 kernel: audit: type=1300 audit(1707818665.889:1609): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd13efeaa0 a2=3 a3=0 items=0 ppid=1 pid=9381 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=49 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:04:25.889000 audit[9381]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd13efeaa0 a2=3 a3=0 items=0 ppid=1 pid=9381 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=49 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:04:26.221657 kernel: audit: type=1327 audit(1707818665.889:1609): proctitle=737368643A20636F7265205B707269765D Feb 13 10:04:25.889000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 10:04:26.252187 kernel: 
audit: type=1105 audit(1707818665.893:1610): pid=9381 uid=0 auid=500 ses=49 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:25.893000 audit[9381]: USER_START pid=9381 uid=0 auid=500 ses=49 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:25.894000 audit[9383]: CRED_ACQ pid=9383 uid=0 auid=500 ses=49 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:26.436176 kernel: audit: type=1103 audit(1707818665.894:1611): pid=9383 uid=0 auid=500 ses=49 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:26.436211 kernel: audit: type=1106 audit(1707818665.971:1612): pid=9381 uid=0 auid=500 ses=49 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:25.971000 audit[9381]: USER_END pid=9381 uid=0 auid=500 ses=49 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:26.531779 kernel: audit: type=1104 audit(1707818665.971:1613): pid=9381 uid=0 auid=500 ses=49 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:25.971000 audit[9381]: CRED_DISP pid=9381 uid=0 auid=500 ses=49 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:25.972000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@46-139.178.70.43:22-139.178.68.195:49276 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 10:04:30.855195 env[1473]: time="2024-02-13T10:04:30.855103623Z" level=info msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\"" Feb 13 10:04:30.855195 env[1473]: time="2024-02-13T10:04:30.855122123Z" level=info msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\"" Feb 13 10:04:30.882360 env[1473]: time="2024-02-13T10:04:30.882288728Z" level=error msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\" failed" error="failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:04:30.882360 env[1473]: time="2024-02-13T10:04:30.882319554Z" level=error msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\" failed" error="failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:04:30.882556 kubelet[2593]: E0213 10:04:30.882536 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f" Feb 13 10:04:30.882719 kubelet[2593]: E0213 10:04:30.882574 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f} Feb 13 10:04:30.882719 kubelet[2593]: E0213 10:04:30.882595 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:04:30.882719 kubelet[2593]: E0213 10:04:30.882544 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2" Feb 13 10:04:30.882719 kubelet[2593]: E0213 10:04:30.882616 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": 
plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250 Feb 13 10:04:30.882719 kubelet[2593]: E0213 10:04:30.882633 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2} Feb 13 10:04:30.882855 kubelet[2593]: E0213 10:04:30.882653 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:04:30.882855 kubelet[2593]: E0213 10:04:30.882668 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-sv24x" podUID=fe6819ac-25fb-455a-b6b5-7432acf1219d Feb 13 10:04:30.981631 systemd[1]: Started sshd@47-139.178.70.43:22-139.178.68.195:59702.service. Feb 13 10:04:30.980000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@47-139.178.70.43:22-139.178.68.195:59702 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:04:31.008790 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 10:04:31.008866 kernel: audit: type=1130 audit(1707818670.980:1615): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@47-139.178.70.43:22-139.178.68.195:59702 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:04:31.118000 audit[9465]: USER_ACCT pid=9465 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:31.119479 sshd[9465]: Accepted publickey for core from 139.178.68.195 port 59702 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 10:04:31.120633 sshd[9465]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 10:04:31.122946 systemd-logind[1461]: New session 50 of user core. Feb 13 10:04:31.123471 systemd[1]: Started session-50.scope. Feb 13 10:04:31.203244 sshd[9465]: pam_unix(sshd:session): session closed for user core Feb 13 10:04:31.204778 systemd[1]: sshd@47-139.178.70.43:22-139.178.68.195:59702.service: Deactivated successfully. Feb 13 10:04:31.205204 systemd[1]: session-50.scope: Deactivated successfully. Feb 13 10:04:31.205637 systemd-logind[1461]: Session 50 logged out. Waiting for processes to exit. 
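Each SYSCALL record in this capture is followed by a PROCTITLE companion whose value is hex-encoded. Decoding the value that recurs throughout the log confirms these records belong to the privileged sshd children handling the "core" logins; a one-line sketch using only standard hex decoding, no audit-specific tooling assumed:

# Hex string copied from the PROCTITLE records in this log.
hexval = "737368643A20636F7265205B707269765D"
print(bytes.fromhex(hexval).decode("ascii"))  # -> sshd: core [priv]

auditd hex-encodes proctitle because argv may contain spaces or NUL separators; a multi-argument command line would decode with embedded NUL bytes between the arguments.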
Feb 13 10:04:31.206185 systemd-logind[1461]: Removed session 50. Feb 13 10:04:31.119000 audit[9465]: CRED_ACQ pid=9465 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:31.301425 kernel: audit: type=1101 audit(1707818671.118:1616): pid=9465 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:31.301459 kernel: audit: type=1103 audit(1707818671.119:1617): pid=9465 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:31.301482 kernel: audit: type=1006 audit(1707818671.119:1618): pid=9465 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=50 res=1 Feb 13 10:04:31.359998 kernel: audit: type=1300 audit(1707818671.119:1618): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc80ee6950 a2=3 a3=0 items=0 ppid=1 pid=9465 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=50 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:04:31.119000 audit[9465]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc80ee6950 a2=3 a3=0 items=0 ppid=1 pid=9465 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=50 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:04:31.452064 kernel: audit: type=1327 audit(1707818671.119:1618): proctitle=737368643A20636F7265205B707269765D Feb 13 10:04:31.119000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 10:04:31.482604 kernel: audit: type=1105 audit(1707818671.124:1619): pid=9465 uid=0 auid=500 ses=50 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:31.124000 audit[9465]: USER_START pid=9465 uid=0 auid=500 ses=50 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:31.577168 kernel: audit: type=1103 audit(1707818671.124:1620): pid=9467 uid=0 auid=500 ses=50 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:31.124000 audit[9467]: CRED_ACQ pid=9467 uid=0 auid=500 ses=50 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:31.666482 kernel: audit: type=1106 audit(1707818671.202:1621): pid=9465 uid=0 auid=500 ses=50 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close 
grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:31.202000 audit[9465]: USER_END pid=9465 uid=0 auid=500 ses=50 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:31.762101 kernel: audit: type=1104 audit(1707818671.202:1622): pid=9465 uid=0 auid=500 ses=50 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:31.202000 audit[9465]: CRED_DISP pid=9465 uid=0 auid=500 ses=50 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:31.203000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@47-139.178.70.43:22-139.178.68.195:59702 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:04:31.853330 env[1473]: time="2024-02-13T10:04:31.853282891Z" level=info msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\"" Feb 13 10:04:31.864810 env[1473]: time="2024-02-13T10:04:31.864746059Z" level=error msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\" failed" error="failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:04:31.865034 kubelet[2593]: E0213 10:04:31.864905 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654" Feb 13 10:04:31.865034 kubelet[2593]: E0213 10:04:31.864934 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654} Feb 13 10:04:31.865034 kubelet[2593]: E0213 10:04:31.864959 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:04:31.865034 kubelet[2593]: E0213 10:04:31.864978 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for 
\"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-zxn6w" podUID=ac15c9fc-cc5d-4a8f-ac09-16f6497ee733 Feb 13 10:04:35.854926 env[1473]: time="2024-02-13T10:04:35.854832480Z" level=info msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\"" Feb 13 10:04:35.884062 env[1473]: time="2024-02-13T10:04:35.884013104Z" level=error msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\" failed" error="failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:04:35.884300 kubelet[2593]: E0213 10:04:35.884289 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768" Feb 13 10:04:35.884482 kubelet[2593]: E0213 10:04:35.884317 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768} Feb 13 10:04:35.884482 kubelet[2593]: E0213 10:04:35.884344 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:04:35.884482 kubelet[2593]: E0213 10:04:35.884383 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-86cd8c4979-2tlsw" podUID=18384425-4aba-475c-a64f-6bfe3101b275 Feb 13 10:04:36.212543 systemd[1]: Started sshd@48-139.178.70.43:22-139.178.68.195:58192.service. Feb 13 10:04:36.211000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@48-139.178.70.43:22-139.178.68.195:58192 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 10:04:36.239776 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 10:04:36.239842 kernel: audit: type=1130 audit(1707818676.211:1624): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@48-139.178.70.43:22-139.178.68.195:58192 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:04:36.349000 audit[9545]: USER_ACCT pid=9545 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:36.351205 sshd[9545]: Accepted publickey for core from 139.178.68.195 port 58192 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 10:04:36.354429 sshd[9545]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 10:04:36.364018 systemd-logind[1461]: New session 51 of user core. Feb 13 10:04:36.365991 systemd[1]: Started session-51.scope. Feb 13 10:04:36.352000 audit[9545]: CRED_ACQ pid=9545 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:36.450558 sshd[9545]: pam_unix(sshd:session): session closed for user core Feb 13 10:04:36.452007 systemd[1]: sshd@48-139.178.70.43:22-139.178.68.195:58192.service: Deactivated successfully. Feb 13 10:04:36.452446 systemd[1]: session-51.scope: Deactivated successfully. Feb 13 10:04:36.452813 systemd-logind[1461]: Session 51 logged out. Waiting for processes to exit. Feb 13 10:04:36.453378 systemd-logind[1461]: Removed session 51. 
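The audit(SECONDS.MILLIS:SERIAL) stamps interleaved with the journal timestamps are Unix epoch seconds plus a per-record serial number, which is why the kernel-echoed copies of a record can appear in the journal well after the event they describe. Converting the type=1130 stamp from the record above reproduces the journal's wall clock (assuming, as the matching lines suggest, that this host logs in UTC):

from datetime import datetime, timezone

stamp = "1707818676.211:1624"  # from the type=1130 record above
secs, serial = stamp.split(":")
print(datetime.fromtimestamp(float(secs), tz=timezone.utc))
# -> 2024-02-13 10:04:36.211000+00:00, matching "Feb 13 10:04:36" above

The serial also clarifies the recurring "kauditd_printk_skb: N callbacks suppressed" notices: serials missing from the kernel-echoed lines were rate-limited out of the printk echo, while the records themselves still reached the audit subsystem, as the unechoed audit[...] entries here show.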
Feb 13 10:04:36.532934 kernel: audit: type=1101 audit(1707818676.349:1625): pid=9545 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:36.532971 kernel: audit: type=1103 audit(1707818676.352:1626): pid=9545 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:36.532985 kernel: audit: type=1006 audit(1707818676.352:1627): pid=9545 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=51 res=1 Feb 13 10:04:36.591555 kernel: audit: type=1300 audit(1707818676.352:1627): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe70bf2ec0 a2=3 a3=0 items=0 ppid=1 pid=9545 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=51 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:04:36.352000 audit[9545]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe70bf2ec0 a2=3 a3=0 items=0 ppid=1 pid=9545 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=51 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:04:36.683580 kernel: audit: type=1327 audit(1707818676.352:1627): proctitle=737368643A20636F7265205B707269765D Feb 13 10:04:36.352000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 10:04:36.714118 kernel: audit: type=1105 audit(1707818676.371:1628): pid=9545 uid=0 auid=500 ses=51 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:36.371000 audit[9545]: USER_START pid=9545 uid=0 auid=500 ses=51 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:36.808690 kernel: audit: type=1103 audit(1707818676.373:1629): pid=9547 uid=0 auid=500 ses=51 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:36.373000 audit[9547]: CRED_ACQ pid=9547 uid=0 auid=500 ses=51 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:36.449000 audit[9545]: USER_END pid=9545 uid=0 auid=500 ses=51 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:36.993791 kernel: audit: type=1106 audit(1707818676.449:1630): pid=9545 uid=0 auid=500 ses=51 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close 
grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:36.993826 kernel: audit: type=1104 audit(1707818676.449:1631): pid=9545 uid=0 auid=500 ses=51 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:36.449000 audit[9545]: CRED_DISP pid=9545 uid=0 auid=500 ses=51 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:36.450000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@48-139.178.70.43:22-139.178.68.195:58192 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:04:41.461682 systemd[1]: Started sshd@49-139.178.70.43:22-139.178.68.195:58208.service. Feb 13 10:04:41.460000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@49-139.178.70.43:22-139.178.68.195:58208 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:04:41.489097 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 10:04:41.489173 kernel: audit: type=1130 audit(1707818681.460:1633): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@49-139.178.70.43:22-139.178.68.195:58208 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:04:41.598000 audit[9569]: USER_ACCT pid=9569 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:41.599605 sshd[9569]: Accepted publickey for core from 139.178.68.195 port 58208 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 10:04:41.601645 sshd[9569]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 10:04:41.603987 systemd-logind[1461]: New session 52 of user core. Feb 13 10:04:41.604445 systemd[1]: Started session-52.scope. Feb 13 10:04:41.681562 sshd[9569]: pam_unix(sshd:session): session closed for user core Feb 13 10:04:41.683019 systemd[1]: sshd@49-139.178.70.43:22-139.178.68.195:58208.service: Deactivated successfully. Feb 13 10:04:41.683455 systemd[1]: session-52.scope: Deactivated successfully. Feb 13 10:04:41.683884 systemd-logind[1461]: Session 52 logged out. Waiting for processes to exit. Feb 13 10:04:41.684750 systemd-logind[1461]: Removed session 52. 
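The PAM records repeat a fixed shape per SSH session: USER_ACCT and CRED_ACQ on connect, USER_START when systemd starts session-N.scope, then USER_END and CRED_DISP on disconnect, each emitted as flat key=value text. A small parsing sketch for that format (field list modeled on the USER_ACCT records above; real tools such as ausearch handle quoting and enriched fields far more robustly):

import shlex

# Modeled on the USER_ACCT records in this log; not a full audit parser.
record = ("pid=9569 uid=0 auid=4294967295 ses=4294967295 "
          "msg='op=PAM:accounting acct=\"core\" terminal=ssh res=success'")
fields = dict(token.split("=", 1) for token in shlex.split(record))
print(fields["ses"], fields["msg"])

Note the auid=4294967295 in the pre-login records: that is the "unset" sentinel (2^32 - 1, i.e. -1 as an unsigned 32-bit value). Once PAM assigns the login uid, the type=1006 records above show the transition explicitly (old-auid=4294967295 auid=500, old-ses=4294967295 ses=52).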
Feb 13 10:04:41.600000 audit[9569]: CRED_ACQ pid=9569 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:41.781498 kernel: audit: type=1101 audit(1707818681.598:1634): pid=9569 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:41.781562 kernel: audit: type=1103 audit(1707818681.600:1635): pid=9569 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:41.781581 kernel: audit: type=1006 audit(1707818681.600:1636): pid=9569 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=52 res=1 Feb 13 10:04:41.840102 kernel: audit: type=1300 audit(1707818681.600:1636): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffdf60e35d0 a2=3 a3=0 items=0 ppid=1 pid=9569 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=52 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:04:41.600000 audit[9569]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffdf60e35d0 a2=3 a3=0 items=0 ppid=1 pid=9569 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=52 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:04:41.932135 kernel: audit: type=1327 audit(1707818681.600:1636): proctitle=737368643A20636F7265205B707269765D Feb 13 10:04:41.600000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 10:04:41.962644 kernel: audit: type=1105 audit(1707818681.605:1637): pid=9569 uid=0 auid=500 ses=52 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:41.605000 audit[9569]: USER_START pid=9569 uid=0 auid=500 ses=52 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:42.057185 kernel: audit: type=1103 audit(1707818681.606:1638): pid=9571 uid=0 auid=500 ses=52 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:41.606000 audit[9571]: CRED_ACQ pid=9571 uid=0 auid=500 ses=52 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:42.146424 kernel: audit: type=1106 audit(1707818681.680:1639): pid=9569 uid=0 auid=500 ses=52 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail 
acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:41.680000 audit[9569]: USER_END pid=9569 uid=0 auid=500 ses=52 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:42.242037 kernel: audit: type=1104 audit(1707818681.681:1640): pid=9569 uid=0 auid=500 ses=52 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:41.681000 audit[9569]: CRED_DISP pid=9569 uid=0 auid=500 ses=52 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:41.681000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@49-139.178.70.43:22-139.178.68.195:58208 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:04:42.854591 env[1473]: time="2024-02-13T10:04:42.854494669Z" level=info msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\"" Feb 13 10:04:42.881203 env[1473]: time="2024-02-13T10:04:42.881168083Z" level=error msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\" failed" error="failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:04:42.881442 kubelet[2593]: E0213 10:04:42.881402 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2" Feb 13 10:04:42.881442 kubelet[2593]: E0213 10:04:42.881428 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2} Feb 13 10:04:42.881797 kubelet[2593]: E0213 10:04:42.881449 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:04:42.881797 kubelet[2593]: E0213 10:04:42.881466 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for 
sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-sv24x" podUID=fe6819ac-25fb-455a-b6b5-7432acf1219d Feb 13 10:04:43.854963 env[1473]: time="2024-02-13T10:04:43.854862353Z" level=info msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\"" Feb 13 10:04:43.881489 env[1473]: time="2024-02-13T10:04:43.881451556Z" level=error msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\" failed" error="failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:04:43.881663 kubelet[2593]: E0213 10:04:43.881635 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f" Feb 13 10:04:43.881663 kubelet[2593]: E0213 10:04:43.881663 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f} Feb 13 10:04:43.881863 kubelet[2593]: E0213 10:04:43.881686 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:04:43.881863 kubelet[2593]: E0213 10:04:43.881705 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250 Feb 13 10:04:44.854913 env[1473]: time="2024-02-13T10:04:44.854783767Z" level=info msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\"" Feb 13 10:04:44.881548 env[1473]: time="2024-02-13T10:04:44.881487425Z" level=error msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\" failed" error="failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that 
the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:04:44.881766 kubelet[2593]: E0213 10:04:44.881641 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654" Feb 13 10:04:44.881766 kubelet[2593]: E0213 10:04:44.881671 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654} Feb 13 10:04:44.881766 kubelet[2593]: E0213 10:04:44.881694 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:04:44.881766 kubelet[2593]: E0213 10:04:44.881714 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-zxn6w" podUID=ac15c9fc-cc5d-4a8f-ac09-16f6497ee733 Feb 13 10:04:46.691448 systemd[1]: Started sshd@50-139.178.70.43:22-139.178.68.195:51860.service. Feb 13 10:04:46.690000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@50-139.178.70.43:22-139.178.68.195:51860 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:04:46.718397 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 10:04:46.718522 kernel: audit: type=1130 audit(1707818686.690:1642): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@50-139.178.70.43:22-139.178.68.195:51860 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:04:46.828000 audit[9684]: USER_ACCT pid=9684 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:46.828942 sshd[9684]: Accepted publickey for core from 139.178.68.195 port 51860 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 10:04:46.829644 sshd[9684]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 10:04:46.831651 systemd-logind[1461]: New session 53 of user core. Feb 13 10:04:46.832171 systemd[1]: Started session-53.scope. 
Feb 13 10:04:46.911711 sshd[9684]: pam_unix(sshd:session): session closed for user core Feb 13 10:04:46.913033 systemd[1]: sshd@50-139.178.70.43:22-139.178.68.195:51860.service: Deactivated successfully. Feb 13 10:04:46.913460 systemd[1]: session-53.scope: Deactivated successfully. Feb 13 10:04:46.913725 systemd-logind[1461]: Session 53 logged out. Waiting for processes to exit. Feb 13 10:04:46.914083 systemd-logind[1461]: Removed session 53. Feb 13 10:04:46.828000 audit[9684]: CRED_ACQ pid=9684 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:47.012644 kernel: audit: type=1101 audit(1707818686.828:1643): pid=9684 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:47.012682 kernel: audit: type=1103 audit(1707818686.828:1644): pid=9684 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:47.012701 kernel: audit: type=1006 audit(1707818686.828:1645): pid=9684 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=53 res=1 Feb 13 10:04:47.071206 kernel: audit: type=1300 audit(1707818686.828:1645): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffce5ddcac0 a2=3 a3=0 items=0 ppid=1 pid=9684 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=53 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:04:46.828000 audit[9684]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffce5ddcac0 a2=3 a3=0 items=0 ppid=1 pid=9684 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=53 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:04:47.163180 kernel: audit: type=1327 audit(1707818686.828:1645): proctitle=737368643A20636F7265205B707269765D Feb 13 10:04:46.828000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 10:04:47.193674 kernel: audit: type=1105 audit(1707818686.833:1646): pid=9684 uid=0 auid=500 ses=53 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:46.833000 audit[9684]: USER_START pid=9684 uid=0 auid=500 ses=53 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:47.288150 kernel: audit: type=1103 audit(1707818686.833:1647): pid=9686 uid=0 auid=500 ses=53 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:46.833000 audit[9686]: CRED_ACQ pid=9686 uid=0 auid=500 ses=53 
subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:47.377415 kernel: audit: type=1106 audit(1707818686.911:1648): pid=9684 uid=0 auid=500 ses=53 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:46.911000 audit[9684]: USER_END pid=9684 uid=0 auid=500 ses=53 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:46.911000 audit[9684]: CRED_DISP pid=9684 uid=0 auid=500 ses=53 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:47.562176 kernel: audit: type=1104 audit(1707818686.911:1649): pid=9684 uid=0 auid=500 ses=53 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:46.911000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@50-139.178.70.43:22-139.178.68.195:51860 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:04:49.854683 env[1473]: time="2024-02-13T10:04:49.854541692Z" level=info msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\"" Feb 13 10:04:49.883033 env[1473]: time="2024-02-13T10:04:49.882969340Z" level=error msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\" failed" error="failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:04:49.883129 kubelet[2593]: E0213 10:04:49.883116 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768" Feb 13 10:04:49.883289 kubelet[2593]: E0213 10:04:49.883141 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768} Feb 13 10:04:49.883289 kubelet[2593]: E0213 10:04:49.883161 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin 
type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:04:49.883289 kubelet[2593]: E0213 10:04:49.883178 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-86cd8c4979-2tlsw" podUID=18384425-4aba-475c-a64f-6bfe3101b275 Feb 13 10:04:51.922073 systemd[1]: Started sshd@51-139.178.70.43:22-139.178.68.195:51868.service. Feb 13 10:04:51.920000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@51-139.178.70.43:22-139.178.68.195:51868 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:04:51.948651 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 10:04:51.948690 kernel: audit: type=1130 audit(1707818691.920:1651): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@51-139.178.70.43:22-139.178.68.195:51868 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:04:52.059000 audit[9737]: USER_ACCT pid=9737 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:52.059927 sshd[9737]: Accepted publickey for core from 139.178.68.195 port 51868 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 10:04:52.061351 sshd[9737]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 10:04:52.063567 systemd-logind[1461]: New session 54 of user core. Feb 13 10:04:52.064072 systemd[1]: Started session-54.scope. Feb 13 10:04:52.150569 sshd[9737]: pam_unix(sshd:session): session closed for user core Feb 13 10:04:52.059000 audit[9737]: CRED_ACQ pid=9737 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:52.151939 systemd[1]: sshd@51-139.178.70.43:22-139.178.68.195:51868.service: Deactivated successfully. Feb 13 10:04:52.152382 systemd[1]: session-54.scope: Deactivated successfully. Feb 13 10:04:52.152772 systemd-logind[1461]: Session 54 logged out. Waiting for processes to exit. Feb 13 10:04:52.153167 systemd-logind[1461]: Removed session 54. 
Feb 13 10:04:52.241547 kernel: audit: type=1101 audit(1707818692.059:1652): pid=9737 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:52.241592 kernel: audit: type=1103 audit(1707818692.059:1653): pid=9737 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:52.241609 kernel: audit: type=1006 audit(1707818692.059:1654): pid=9737 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=54 res=1 Feb 13 10:04:52.300064 kernel: audit: type=1300 audit(1707818692.059:1654): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffdf1331ad0 a2=3 a3=0 items=0 ppid=1 pid=9737 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=54 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:04:52.059000 audit[9737]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffdf1331ad0 a2=3 a3=0 items=0 ppid=1 pid=9737 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=54 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:04:52.391997 kernel: audit: type=1327 audit(1707818692.059:1654): proctitle=737368643A20636F7265205B707269765D Feb 13 10:04:52.059000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 10:04:52.422527 kernel: audit: type=1105 audit(1707818692.064:1655): pid=9737 uid=0 auid=500 ses=54 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:52.064000 audit[9737]: USER_START pid=9737 uid=0 auid=500 ses=54 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:52.516962 kernel: audit: type=1103 audit(1707818692.065:1656): pid=9739 uid=0 auid=500 ses=54 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:52.065000 audit[9739]: CRED_ACQ pid=9739 uid=0 auid=500 ses=54 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:52.606188 kernel: audit: type=1106 audit(1707818692.150:1657): pid=9737 uid=0 auid=500 ses=54 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:52.150000 audit[9737]: USER_END pid=9737 uid=0 auid=500 ses=54 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close 
grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:52.701642 kernel: audit: type=1104 audit(1707818692.150:1658): pid=9737 uid=0 auid=500 ses=54 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:52.150000 audit[9737]: CRED_DISP pid=9737 uid=0 auid=500 ses=54 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:52.150000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@51-139.178.70.43:22-139.178.68.195:51868 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:04:55.211000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:04:55.211000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c000225fa0 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 10:04:55.211000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 10:04:55.211000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:04:55.211000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=c a1=c003125bc0 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 10:04:55.211000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 10:04:55.439000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:04:55.439000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=67 a1=c00fc96660 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" 
exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 10:04:55.439000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 10:04:55.439000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:04:55.439000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=67 a1=c002330020 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 10:04:55.439000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 10:04:55.441000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:04:55.441000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=67 a1=c00c0f2ae0 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 10:04:55.441000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 10:04:55.443000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:04:55.443000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=67 a1=c002330040 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 10:04:55.443000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 10:04:55.443000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/apiserver.crt" dev="sdb9" ino=525073 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:04:55.443000 
audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=68 a1=c00c387620 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 10:04:55.443000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 10:04:55.443000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-client.crt" dev="sdb9" ino=525079 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:04:55.443000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=69 a1=c0130266c0 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 10:04:55.443000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 10:04:55.854537 env[1473]: time="2024-02-13T10:04:55.854412218Z" level=info msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\"" Feb 13 10:04:55.904716 env[1473]: time="2024-02-13T10:04:55.904622238Z" level=error msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\" failed" error="failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:04:55.904943 kubelet[2593]: E0213 10:04:55.904886 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654" Feb 13 10:04:55.904943 kubelet[2593]: E0213 10:04:55.904936 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654} Feb 13 10:04:55.905425 kubelet[2593]: E0213 10:04:55.904987 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running 
and has mounted /var/lib/calico/\"" Feb 13 10:04:55.905425 kubelet[2593]: E0213 10:04:55.905026 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-zxn6w" podUID=ac15c9fc-cc5d-4a8f-ac09-16f6497ee733 Feb 13 10:04:56.854815 env[1473]: time="2024-02-13T10:04:56.854709900Z" level=info msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\"" Feb 13 10:04:56.880682 env[1473]: time="2024-02-13T10:04:56.880614581Z" level=error msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\" failed" error="failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:04:56.880808 kubelet[2593]: E0213 10:04:56.880796 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2" Feb 13 10:04:56.880858 kubelet[2593]: E0213 10:04:56.880823 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2} Feb 13 10:04:56.880858 kubelet[2593]: E0213 10:04:56.880848 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:04:56.880931 kubelet[2593]: E0213 10:04:56.880868 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-sv24x" podUID=fe6819ac-25fb-455a-b6b5-7432acf1219d Feb 13 10:04:57.162535 systemd[1]: Started sshd@52-139.178.70.43:22-139.178.68.195:56072.service. 
Feb 13 10:04:57.162000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@52-139.178.70.43:22-139.178.68.195:56072 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:04:57.189979 kernel: kauditd_printk_skb: 25 callbacks suppressed Feb 13 10:04:57.190065 kernel: audit: type=1130 audit(1707818697.162:1668): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@52-139.178.70.43:22-139.178.68.195:56072 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:04:57.300000 audit[9819]: USER_ACCT pid=9819 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:57.300701 sshd[9819]: Accepted publickey for core from 139.178.68.195 port 56072 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 10:04:57.302586 sshd[9819]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 10:04:57.304885 systemd-logind[1461]: New session 55 of user core. Feb 13 10:04:57.305438 systemd[1]: Started session-55.scope. Feb 13 10:04:57.386202 sshd[9819]: pam_unix(sshd:session): session closed for user core Feb 13 10:04:57.387679 systemd[1]: sshd@52-139.178.70.43:22-139.178.68.195:56072.service: Deactivated successfully. Feb 13 10:04:57.388105 systemd[1]: session-55.scope: Deactivated successfully. Feb 13 10:04:57.388429 systemd-logind[1461]: Session 55 logged out. Waiting for processes to exit. Feb 13 10:04:57.389004 systemd-logind[1461]: Removed session 55. 
Feb 13 10:04:57.301000 audit[9819]: CRED_ACQ pid=9819 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:57.484534 kernel: audit: type=1101 audit(1707818697.300:1669): pid=9819 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:57.484590 kernel: audit: type=1103 audit(1707818697.301:1670): pid=9819 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:57.484609 kernel: audit: type=1006 audit(1707818697.302:1671): pid=9819 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=55 res=1 Feb 13 10:04:57.543137 kernel: audit: type=1300 audit(1707818697.302:1671): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd8a5f5d00 a2=3 a3=0 items=0 ppid=1 pid=9819 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=55 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:04:57.302000 audit[9819]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd8a5f5d00 a2=3 a3=0 items=0 ppid=1 pid=9819 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=55 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:04:57.635112 kernel: audit: type=1327 audit(1707818697.302:1671): proctitle=737368643A20636F7265205B707269765D Feb 13 10:04:57.302000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 10:04:57.665614 kernel: audit: type=1105 audit(1707818697.307:1672): pid=9819 uid=0 auid=500 ses=55 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:57.307000 audit[9819]: USER_START pid=9819 uid=0 auid=500 ses=55 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:57.760087 kernel: audit: type=1103 audit(1707818697.308:1673): pid=9821 uid=0 auid=500 ses=55 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:57.308000 audit[9821]: CRED_ACQ pid=9821 uid=0 auid=500 ses=55 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:57.849297 kernel: audit: type=1106 audit(1707818697.386:1674): pid=9819 uid=0 auid=500 ses=55 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail 
acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:57.386000 audit[9819]: USER_END pid=9819 uid=0 auid=500 ses=55 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:57.944809 kernel: audit: type=1104 audit(1707818697.386:1675): pid=9819 uid=0 auid=500 ses=55 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:57.386000 audit[9819]: CRED_DISP pid=9819 uid=0 auid=500 ses=55 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:04:57.387000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@52-139.178.70.43:22-139.178.68.195:56072 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:04:58.855571 env[1473]: time="2024-02-13T10:04:58.855484729Z" level=info msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\"" Feb 13 10:04:58.871145 env[1473]: time="2024-02-13T10:04:58.871083200Z" level=error msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\" failed" error="failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:04:58.871246 kubelet[2593]: E0213 10:04:58.871183 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f" Feb 13 10:04:58.871246 kubelet[2593]: E0213 10:04:58.871209 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f} Feb 13 10:04:58.871246 kubelet[2593]: E0213 10:04:58.871232 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:04:58.871499 kubelet[2593]: E0213 10:04:58.871249 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for 
sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250 Feb 13 10:05:01.854726 env[1473]: time="2024-02-13T10:05:01.854590511Z" level=info msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\"" Feb 13 10:05:01.880910 env[1473]: time="2024-02-13T10:05:01.880833934Z" level=error msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\" failed" error="failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:05:01.881069 kubelet[2593]: E0213 10:05:01.881049 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768" Feb 13 10:05:01.881218 kubelet[2593]: E0213 10:05:01.881071 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768} Feb 13 10:05:01.881218 kubelet[2593]: E0213 10:05:01.881091 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:05:01.881218 kubelet[2593]: E0213 10:05:01.881114 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-86cd8c4979-2tlsw" podUID=18384425-4aba-475c-a64f-6bfe3101b275 Feb 13 10:05:02.390106 systemd[1]: Started sshd@53-139.178.70.43:22-139.178.68.195:56082.service. Feb 13 10:05:02.389000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@53-139.178.70.43:22-139.178.68.195:56082 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 10:05:02.417376 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 10:05:02.417450 kernel: audit: type=1130 audit(1707818702.389:1677): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@53-139.178.70.43:22-139.178.68.195:56082 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:05:02.527000 audit[9907]: USER_ACCT pid=9907 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:02.527485 sshd[9907]: Accepted publickey for core from 139.178.68.195 port 56082 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 10:05:02.528631 sshd[9907]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 10:05:02.531038 systemd-logind[1461]: New session 56 of user core. Feb 13 10:05:02.531526 systemd[1]: Started session-56.scope. Feb 13 10:05:02.612062 sshd[9907]: pam_unix(sshd:session): session closed for user core Feb 13 10:05:02.613424 systemd[1]: sshd@53-139.178.70.43:22-139.178.68.195:56082.service: Deactivated successfully. Feb 13 10:05:02.613844 systemd[1]: session-56.scope: Deactivated successfully. Feb 13 10:05:02.614178 systemd-logind[1461]: Session 56 logged out. Waiting for processes to exit. Feb 13 10:05:02.614777 systemd-logind[1461]: Removed session 56. Feb 13 10:05:02.528000 audit[9907]: CRED_ACQ pid=9907 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:02.710297 kernel: audit: type=1101 audit(1707818702.527:1678): pid=9907 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:02.710335 kernel: audit: type=1103 audit(1707818702.528:1679): pid=9907 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:02.710359 kernel: audit: type=1006 audit(1707818702.528:1680): pid=9907 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=56 res=1 Feb 13 10:05:02.768854 kernel: audit: type=1300 audit(1707818702.528:1680): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff6c181fd0 a2=3 a3=0 items=0 ppid=1 pid=9907 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=56 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:05:02.528000 audit[9907]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff6c181fd0 a2=3 a3=0 items=0 ppid=1 pid=9907 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=56 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:05:02.860850 kernel: audit: type=1327 audit(1707818702.528:1680): proctitle=737368643A20636F7265205B707269765D Feb 13 10:05:02.528000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 10:05:02.891406 kernel: 
audit: type=1105 audit(1707818702.533:1681): pid=9907 uid=0 auid=500 ses=56 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:02.533000 audit[9907]: USER_START pid=9907 uid=0 auid=500 ses=56 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:02.985866 kernel: audit: type=1103 audit(1707818702.534:1682): pid=9909 uid=0 auid=500 ses=56 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:02.534000 audit[9909]: CRED_ACQ pid=9909 uid=0 auid=500 ses=56 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:03.075092 kernel: audit: type=1106 audit(1707818702.612:1683): pid=9907 uid=0 auid=500 ses=56 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:02.612000 audit[9907]: USER_END pid=9907 uid=0 auid=500 ses=56 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:03.170626 kernel: audit: type=1104 audit(1707818702.612:1684): pid=9907 uid=0 auid=500 ses=56 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:02.612000 audit[9907]: CRED_DISP pid=9907 uid=0 auid=500 ses=56 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:02.613000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@53-139.178.70.43:22-139.178.68.195:56082 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:05:07.623624 systemd[1]: Started sshd@54-139.178.70.43:22-139.178.68.195:35470.service. Feb 13 10:05:07.623000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@54-139.178.70.43:22-139.178.68.195:35470 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:05:07.651169 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 10:05:07.651202 kernel: audit: type=1130 audit(1707818707.623:1686): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@54-139.178.70.43:22-139.178.68.195:35470 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 10:05:07.761000 audit[9933]: USER_ACCT pid=9933 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:07.761895 sshd[9933]: Accepted publickey for core from 139.178.68.195 port 35470 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 10:05:07.762618 sshd[9933]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 10:05:07.764958 systemd-logind[1461]: New session 57 of user core. Feb 13 10:05:07.765490 systemd[1]: Started session-57.scope. Feb 13 10:05:07.844483 sshd[9933]: pam_unix(sshd:session): session closed for user core Feb 13 10:05:07.846322 systemd[1]: sshd@54-139.178.70.43:22-139.178.68.195:35470.service: Deactivated successfully. Feb 13 10:05:07.846664 systemd[1]: session-57.scope: Deactivated successfully. Feb 13 10:05:07.847013 systemd-logind[1461]: Session 57 logged out. Waiting for processes to exit. Feb 13 10:05:07.847567 systemd[1]: Started sshd@55-139.178.70.43:22-139.178.68.195:35476.service. Feb 13 10:05:07.848052 systemd-logind[1461]: Removed session 57. Feb 13 10:05:07.762000 audit[9933]: CRED_ACQ pid=9933 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:07.943710 kernel: audit: type=1101 audit(1707818707.761:1687): pid=9933 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:07.943749 kernel: audit: type=1103 audit(1707818707.762:1688): pid=9933 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:07.943766 kernel: audit: type=1006 audit(1707818707.762:1689): pid=9933 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=57 res=1 Feb 13 10:05:08.002284 kernel: audit: type=1300 audit(1707818707.762:1689): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe7d5bb490 a2=3 a3=0 items=0 ppid=1 pid=9933 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=57 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:05:07.762000 audit[9933]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe7d5bb490 a2=3 a3=0 items=0 ppid=1 pid=9933 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=57 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:05:08.022203 sshd[9958]: Accepted publickey for core from 139.178.68.195 port 35476 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 10:05:08.024189 sshd[9958]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 10:05:08.026411 systemd-logind[1461]: New session 58 of user core. Feb 13 10:05:08.026869 systemd[1]: Started session-58.scope. 
Feb 13 10:05:07.762000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 10:05:08.124842 kernel: audit: type=1327 audit(1707818707.762:1689): proctitle=737368643A20636F7265205B707269765D Feb 13 10:05:08.124884 kernel: audit: type=1105 audit(1707818707.767:1690): pid=9933 uid=0 auid=500 ses=57 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:07.767000 audit[9933]: USER_START pid=9933 uid=0 auid=500 ses=57 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:08.219404 kernel: audit: type=1103 audit(1707818707.768:1691): pid=9935 uid=0 auid=500 ses=57 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:07.768000 audit[9935]: CRED_ACQ pid=9935 uid=0 auid=500 ses=57 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:07.844000 audit[9933]: USER_END pid=9933 uid=0 auid=500 ses=57 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:08.404311 kernel: audit: type=1106 audit(1707818707.844:1692): pid=9933 uid=0 auid=500 ses=57 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:08.404354 kernel: audit: type=1104 audit(1707818707.844:1693): pid=9933 uid=0 auid=500 ses=57 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:07.844000 audit[9933]: CRED_DISP pid=9933 uid=0 auid=500 ses=57 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:07.846000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@54-139.178.70.43:22-139.178.68.195:35470 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:05:07.847000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@55-139.178.70.43:22-139.178.68.195:35476 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 10:05:08.021000 audit[9958]: USER_ACCT pid=9958 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:08.023000 audit[9958]: CRED_ACQ pid=9958 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:08.023000 audit[9958]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe2aa84cc0 a2=3 a3=0 items=0 ppid=1 pid=9958 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=58 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:05:08.023000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 10:05:08.028000 audit[9958]: USER_START pid=9958 uid=0 auid=500 ses=58 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:08.029000 audit[9960]: CRED_ACQ pid=9960 uid=0 auid=500 ses=58 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:08.856067 env[1473]: time="2024-02-13T10:05:08.855850200Z" level=info msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\"" Feb 13 10:05:08.881675 env[1473]: time="2024-02-13T10:05:08.881614089Z" level=error msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\" failed" error="failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:05:08.881861 kubelet[2593]: E0213 10:05:08.881806 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654" Feb 13 10:05:08.881861 kubelet[2593]: E0213 10:05:08.881845 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654} Feb 13 10:05:08.882050 kubelet[2593]: E0213 10:05:08.881865 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and 
has mounted /var/lib/calico/\"" Feb 13 10:05:08.882050 kubelet[2593]: E0213 10:05:08.881883 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-zxn6w" podUID=ac15c9fc-cc5d-4a8f-ac09-16f6497ee733 Feb 13 10:05:08.898288 sshd[9958]: pam_unix(sshd:session): session closed for user core Feb 13 10:05:08.898000 audit[9958]: USER_END pid=9958 uid=0 auid=500 ses=58 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:08.898000 audit[9958]: CRED_DISP pid=9958 uid=0 auid=500 ses=58 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:08.900190 systemd[1]: sshd@55-139.178.70.43:22-139.178.68.195:35476.service: Deactivated successfully. Feb 13 10:05:08.899000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@55-139.178.70.43:22-139.178.68.195:35476 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:05:08.900585 systemd[1]: session-58.scope: Deactivated successfully. Feb 13 10:05:08.901060 systemd-logind[1461]: Session 58 logged out. Waiting for processes to exit. Feb 13 10:05:08.901692 systemd[1]: Started sshd@56-139.178.70.43:22-139.178.68.195:35484.service. Feb 13 10:05:08.901000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@56-139.178.70.43:22-139.178.68.195:35484 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:05:08.902216 systemd-logind[1461]: Removed session 58. 
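The PROCTITLE values are the process command line, hex-encoded with NUL bytes separating argv elements; proctitle=737368643A20636F7265205B707269765D above decodes to "sshd: core [priv]", the privilege-separated sshd monitor for the core login. A one-line decoder (illustrative):

```python
def decode_proctitle(hexstr):
    """Decode an audit PROCTITLE value: hex-encoded argv, NUL-separated."""
    return bytes.fromhex(hexstr).replace(b"\x00", b" ").decode("utf-8", "replace")

# -> 'sshd: core [priv]'
print(decode_proctitle("737368643A20636F7265205B707269765D"))
```

The longer kube-controller proctitle further down decodes the same way to "kube-controller-manager --allocate-node-cidrs=true --authentication-kubeconfig=/etc/kubernetes/controller-manager.conf --authori", cut short because audit truncates long command lines in the record.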
Feb 13 10:05:08.939000 audit[10009]: USER_ACCT pid=10009 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:08.939836 sshd[10009]: Accepted publickey for core from 139.178.68.195 port 35484 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 10:05:08.940000 audit[10009]: CRED_ACQ pid=10009 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:08.940000 audit[10009]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffdfe5da00 a2=3 a3=0 items=0 ppid=1 pid=10009 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=59 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:05:08.940000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 10:05:08.940853 sshd[10009]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 10:05:08.944370 systemd-logind[1461]: New session 59 of user core. Feb 13 10:05:08.945322 systemd[1]: Started session-59.scope. Feb 13 10:05:08.948000 audit[10009]: USER_START pid=10009 uid=0 auid=500 ses=59 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:08.949000 audit[10011]: CRED_ACQ pid=10011 uid=0 auid=500 ses=59 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:09.617000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:05:09.617000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c0006f4620 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 10:05:09.617000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 10:05:09.618000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:05:09.618000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c001724380 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" 
subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 10:05:09.618000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 10:05:09.620000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:05:09.620000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c001724580 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 10:05:09.620000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 10:05:09.621000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:05:09.621000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=c a1=c0006f4640 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 10:05:09.621000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 10:05:09.854262 env[1473]: time="2024-02-13T10:05:09.854229783Z" level=info msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\"" Feb 13 10:05:09.862299 sshd[10009]: pam_unix(sshd:session): session closed for user core Feb 13 10:05:09.862000 audit[10009]: USER_END pid=10009 uid=0 auid=500 ses=59 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:09.863000 audit[10009]: CRED_DISP pid=10009 uid=0 auid=500 ses=59 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:09.865572 systemd[1]: sshd@56-139.178.70.43:22-139.178.68.195:35484.service: Deactivated successfully. Feb 13 10:05:09.865000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@56-139.178.70.43:22-139.178.68.195:35484 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 10:05:09.866170 systemd[1]: session-59.scope: Deactivated successfully. Feb 13 10:05:09.867738 systemd-logind[1461]: Session 59 logged out. Waiting for processes to exit. Feb 13 10:05:09.869184 systemd[1]: Started sshd@57-139.178.70.43:22-139.178.68.195:35498.service. Feb 13 10:05:09.869000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@57-139.178.70.43:22-139.178.68.195:35498 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:05:09.870156 systemd-logind[1461]: Removed session 59. Feb 13 10:05:09.878316 env[1473]: time="2024-02-13T10:05:09.878259816Z" level=error msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\" failed" error="failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:05:09.878659 kubelet[2593]: E0213 10:05:09.878486 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2" Feb 13 10:05:09.878659 kubelet[2593]: E0213 10:05:09.878526 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2} Feb 13 10:05:09.878659 kubelet[2593]: E0213 10:05:09.878571 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:05:09.878659 kubelet[2593]: E0213 10:05:09.878603 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-sv24x" podUID=fe6819ac-25fb-455a-b6b5-7432acf1219d Feb 13 10:05:09.878000 audit[10091]: NETFILTER_CFG table=filter:111 family=2 entries=24 op=nft_register_rule pid=10091 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 13 10:05:09.878000 audit[10091]: SYSCALL arch=c000003e syscall=46 success=yes exit=12476 a0=3 a1=7ffdb4b2b8c0 a2=0 a3=7ffdb4b2b8ac items=0 ppid=2856 pid=10091 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" 
exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:05:09.878000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 10:05:09.879000 audit[10091]: NETFILTER_CFG table=nat:112 family=2 entries=30 op=nft_register_rule pid=10091 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 13 10:05:09.879000 audit[10091]: SYSCALL arch=c000003e syscall=46 success=yes exit=8836 a0=3 a1=7ffdb4b2b8c0 a2=0 a3=31030 items=0 ppid=2856 pid=10091 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:05:09.879000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 10:05:09.912000 audit[10079]: USER_ACCT pid=10079 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:09.913232 sshd[10079]: Accepted publickey for core from 139.178.68.195 port 35498 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 10:05:09.913000 audit[10079]: CRED_ACQ pid=10079 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:09.913000 audit[10079]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffedd858710 a2=3 a3=0 items=0 ppid=1 pid=10079 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=60 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:05:09.913000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 10:05:09.914484 sshd[10079]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 10:05:09.918125 systemd-logind[1461]: New session 60 of user core. Feb 13 10:05:09.919261 systemd[1]: Started session-60.scope. 
Feb 13 10:05:09.923000 audit[10079]: USER_START pid=10079 uid=0 auid=500 ses=60 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:09.924000 audit[10104]: CRED_ACQ pid=10104 uid=0 auid=500 ses=60 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:09.942000 audit[10119]: NETFILTER_CFG table=filter:113 family=2 entries=36 op=nft_register_rule pid=10119 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 13 10:05:09.942000 audit[10119]: SYSCALL arch=c000003e syscall=46 success=yes exit=12476 a0=3 a1=7ffdea608820 a2=0 a3=7ffdea60880c items=0 ppid=2856 pid=10119 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:05:09.942000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 10:05:09.943000 audit[10119]: NETFILTER_CFG table=nat:114 family=2 entries=30 op=nft_register_rule pid=10119 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 13 10:05:09.943000 audit[10119]: SYSCALL arch=c000003e syscall=46 success=yes exit=8836 a0=3 a1=7ffdea608820 a2=0 a3=31030 items=0 ppid=2856 pid=10119 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:05:09.943000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 10:05:10.125247 sshd[10079]: pam_unix(sshd:session): session closed for user core Feb 13 10:05:10.125000 audit[10079]: USER_END pid=10079 uid=0 auid=500 ses=60 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:10.125000 audit[10079]: CRED_DISP pid=10079 uid=0 auid=500 ses=60 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:10.127018 systemd[1]: sshd@57-139.178.70.43:22-139.178.68.195:35498.service: Deactivated successfully. Feb 13 10:05:10.126000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@57-139.178.70.43:22-139.178.68.195:35498 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:05:10.127368 systemd[1]: session-60.scope: Deactivated successfully. Feb 13 10:05:10.127742 systemd-logind[1461]: Session 60 logged out. Waiting for processes to exit. Feb 13 10:05:10.128369 systemd[1]: Started sshd@58-139.178.70.43:22-139.178.68.195:35504.service. 
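The systemd SERVICE_START/SERVICE_STOP messages name per-connection SSH units such as sshd@58-139.178.70.43:22-139.178.68.195:35504, i.e. a connection counter followed by the local and remote endpoints, consistent with per-connection sshd instances. A throwaway parser (ours) recovers both sides of each connection:

```python
import re

# systemd names per-connection SSH units like
#   sshd@58-139.178.70.43:22-139.178.68.195:35504.service
# i.e. <counter>-<local addr>:<local port>-<remote addr>:<remote port>.
UNIT = re.compile(
    r"unit=sshd@(?P<n>\d+)-(?P<laddr>[\d.]+):(?P<lport>\d+)"
    r"-(?P<raddr>[\d.]+):(?P<rport>\d+)"
)

m = UNIT.search('unit=sshd@58-139.178.70.43:22-139.178.68.195:35504 comm="systemd"')
# -> ('58', '139.178.70.43', '22', '139.178.68.195', '35504')
print(m.group("n", "laddr", "lport", "raddr", "rport"))
```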
Feb 13 10:05:10.128000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@58-139.178.70.43:22-139.178.68.195:35504 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:05:10.128835 systemd-logind[1461]: Removed session 60. Feb 13 10:05:10.169000 audit[10140]: USER_ACCT pid=10140 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:10.170094 sshd[10140]: Accepted publickey for core from 139.178.68.195 port 35504 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 10:05:10.170000 audit[10140]: CRED_ACQ pid=10140 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:10.170000 audit[10140]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff15f27450 a2=3 a3=0 items=0 ppid=1 pid=10140 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=61 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:05:10.170000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 10:05:10.171706 sshd[10140]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 10:05:10.177194 systemd-logind[1461]: New session 61 of user core. Feb 13 10:05:10.178807 systemd[1]: Started session-61.scope. Feb 13 10:05:10.185000 audit[10140]: USER_START pid=10140 uid=0 auid=500 ses=61 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:10.186000 audit[10143]: CRED_ACQ pid=10143 uid=0 auid=500 ses=61 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:10.325482 sshd[10140]: pam_unix(sshd:session): session closed for user core Feb 13 10:05:10.326000 audit[10140]: USER_END pid=10140 uid=0 auid=500 ses=61 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:10.326000 audit[10140]: CRED_DISP pid=10140 uid=0 auid=500 ses=61 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:10.327407 systemd[1]: sshd@58-139.178.70.43:22-139.178.68.195:35504.service: Deactivated successfully. Feb 13 10:05:10.327000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@58-139.178.70.43:22-139.178.68.195:35504 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:05:10.328033 systemd[1]: session-61.scope: Deactivated successfully. 
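Every StopPodSandbox failure in this log shares one root cause: the Calico CNI plugin stats /var/lib/calico/nodename, a file the calico/node container writes at startup, and the file is missing, so teardown of the coredns, csi-node-driver and calico-kube-controllers sandboxes fails on every kubelet sync until calico/node runs again. A hedged triage sketch (the path comes from the error text itself; the tally helper is ours):

```python
import os
import re
from collections import Counter

# File the Calico CNI plugin stats, per the error text in the log.
NODENAME = "/var/lib/calico/nodename"

# Failed teardown lines carry the 64-hex-char containerd sandbox ID.
SANDBOX_ERR = re.compile(r'StopPodSandbox for \\?"(?P<sb>[0-9a-f]{64})\\?" failed')

def stuck_sandboxes(lines):
    """Tally repeated StopPodSandbox failures per sandbox ID."""
    return Counter(m["sb"] for line in lines for m in SANDBOX_ERR.finditer(line))

# On the node itself the immediate check is simply:
if not os.path.exists(NODENAME):
    print(f"{NODENAME} missing: calico/node not running or volume not mounted")
```

In this capture the sandboxes 2d12d1d6, de8210eb, 5a84d3a4 and b3cffff3 each reappear with the identical error a few seconds apart, one attempt per kubelet sync.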
Feb 13 10:05:10.328539 systemd-logind[1461]: Session 61 logged out. Waiting for processes to exit. Feb 13 10:05:10.329193 systemd-logind[1461]: Removed session 61. Feb 13 10:05:10.855518 env[1473]: time="2024-02-13T10:05:10.855399899Z" level=info msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\"" Feb 13 10:05:10.871182 env[1473]: time="2024-02-13T10:05:10.871110376Z" level=error msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\" failed" error="failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:05:10.871301 kubelet[2593]: E0213 10:05:10.871276 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f" Feb 13 10:05:10.871475 kubelet[2593]: E0213 10:05:10.871306 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f} Feb 13 10:05:10.871475 kubelet[2593]: E0213 10:05:10.871330 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:05:10.871475 kubelet[2593]: E0213 10:05:10.871359 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250 Feb 13 10:05:15.334775 systemd[1]: Started sshd@59-139.178.70.43:22-139.178.68.195:35514.service. Feb 13 10:05:15.334000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@59-139.178.70.43:22-139.178.68.195:35514 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:05:15.362211 kernel: kauditd_printk_skb: 69 callbacks suppressed Feb 13 10:05:15.362261 kernel: audit: type=1130 audit(1707818715.334:1739): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@59-139.178.70.43:22-139.178.68.195:35514 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 10:05:15.382313 sshd[10194]: Accepted publickey for core from 139.178.68.195 port 35514 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 10:05:15.383620 sshd[10194]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 10:05:15.386005 systemd-logind[1461]: New session 62 of user core. Feb 13 10:05:15.386474 systemd[1]: Started session-62.scope. Feb 13 10:05:15.381000 audit[10194]: USER_ACCT pid=10194 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:15.466280 sshd[10194]: pam_unix(sshd:session): session closed for user core Feb 13 10:05:15.467856 systemd[1]: sshd@59-139.178.70.43:22-139.178.68.195:35514.service: Deactivated successfully. Feb 13 10:05:15.468316 systemd[1]: session-62.scope: Deactivated successfully. Feb 13 10:05:15.468798 systemd-logind[1461]: Session 62 logged out. Waiting for processes to exit. Feb 13 10:05:15.469291 systemd-logind[1461]: Removed session 62. Feb 13 10:05:15.544014 kernel: audit: type=1101 audit(1707818715.381:1740): pid=10194 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:15.544050 kernel: audit: type=1103 audit(1707818715.383:1741): pid=10194 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:15.383000 audit[10194]: CRED_ACQ pid=10194 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:15.634648 kernel: audit: type=1006 audit(1707818715.383:1742): pid=10194 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=62 res=1 Feb 13 10:05:15.693321 kernel: audit: type=1300 audit(1707818715.383:1742): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc289274d0 a2=3 a3=0 items=0 ppid=1 pid=10194 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=62 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:05:15.383000 audit[10194]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc289274d0 a2=3 a3=0 items=0 ppid=1 pid=10194 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=62 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:05:15.383000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 10:05:15.815845 kernel: audit: type=1327 audit(1707818715.383:1742): proctitle=737368643A20636F7265205B707269765D Feb 13 10:05:15.815875 kernel: audit: type=1105 audit(1707818715.388:1743): pid=10194 uid=0 auid=500 ses=62 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 
10:05:15.388000 audit[10194]: USER_START pid=10194 uid=0 auid=500 ses=62 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:15.910404 kernel: audit: type=1103 audit(1707818715.389:1744): pid=10196 uid=0 auid=500 ses=62 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:15.389000 audit[10196]: CRED_ACQ pid=10196 uid=0 auid=500 ses=62 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:15.999603 kernel: audit: type=1106 audit(1707818715.466:1745): pid=10194 uid=0 auid=500 ses=62 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:15.466000 audit[10194]: USER_END pid=10194 uid=0 auid=500 ses=62 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:16.095115 kernel: audit: type=1104 audit(1707818715.466:1746): pid=10194 uid=0 auid=500 ses=62 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:15.466000 audit[10194]: CRED_DISP pid=10194 uid=0 auid=500 ses=62 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:15.467000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@59-139.178.70.43:22-139.178.68.195:35514 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 10:05:16.854605 env[1473]: time="2024-02-13T10:05:16.854527044Z" level=info msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\"" Feb 13 10:05:16.901503 env[1473]: time="2024-02-13T10:05:16.901467069Z" level=error msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\" failed" error="failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:05:16.901644 kubelet[2593]: E0213 10:05:16.901633 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768" Feb 13 10:05:16.901810 kubelet[2593]: E0213 10:05:16.901655 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768} Feb 13 10:05:16.901810 kubelet[2593]: E0213 10:05:16.901676 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:05:16.901810 kubelet[2593]: E0213 10:05:16.901692 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-86cd8c4979-2tlsw" podUID=18384425-4aba-475c-a64f-6bfe3101b275 Feb 13 10:05:20.478014 systemd[1]: Started sshd@60-139.178.70.43:22-139.178.68.195:38024.service. Feb 13 10:05:20.478000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@60-139.178.70.43:22-139.178.68.195:38024 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:05:20.522002 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 10:05:20.522149 kernel: audit: type=1130 audit(1707818720.478:1748): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@60-139.178.70.43:22-139.178.68.195:38024 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 10:05:20.632000 audit[10246]: USER_ACCT pid=10246 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:20.633053 sshd[10246]: Accepted publickey for core from 139.178.68.195 port 38024 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 10:05:20.634690 sshd[10246]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 10:05:20.636986 systemd-logind[1461]: New session 63 of user core. Feb 13 10:05:20.637493 systemd[1]: Started session-63.scope. Feb 13 10:05:20.714804 sshd[10246]: pam_unix(sshd:session): session closed for user core Feb 13 10:05:20.716248 systemd[1]: sshd@60-139.178.70.43:22-139.178.68.195:38024.service: Deactivated successfully. Feb 13 10:05:20.716690 systemd[1]: session-63.scope: Deactivated successfully. Feb 13 10:05:20.717033 systemd-logind[1461]: Session 63 logged out. Waiting for processes to exit. Feb 13 10:05:20.717421 systemd-logind[1461]: Removed session 63. Feb 13 10:05:20.634000 audit[10246]: CRED_ACQ pid=10246 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:20.814809 kernel: audit: type=1101 audit(1707818720.632:1749): pid=10246 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:20.814861 kernel: audit: type=1103 audit(1707818720.634:1750): pid=10246 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:20.814879 kernel: audit: type=1006 audit(1707818720.634:1751): pid=10246 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=63 res=1 Feb 13 10:05:20.873367 kernel: audit: type=1300 audit(1707818720.634:1751): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc87c96440 a2=3 a3=0 items=0 ppid=1 pid=10246 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=63 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:05:20.634000 audit[10246]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc87c96440 a2=3 a3=0 items=0 ppid=1 pid=10246 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=63 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:05:20.634000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 10:05:20.995760 kernel: audit: type=1327 audit(1707818720.634:1751): proctitle=737368643A20636F7265205B707269765D Feb 13 10:05:20.995796 kernel: audit: type=1105 audit(1707818720.639:1752): pid=10246 uid=0 auid=500 ses=63 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 
10:05:20.639000 audit[10246]: USER_START pid=10246 uid=0 auid=500 ses=63 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:21.090171 kernel: audit: type=1103 audit(1707818720.640:1753): pid=10248 uid=0 auid=500 ses=63 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:20.640000 audit[10248]: CRED_ACQ pid=10248 uid=0 auid=500 ses=63 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:21.179282 kernel: audit: type=1106 audit(1707818720.715:1754): pid=10246 uid=0 auid=500 ses=63 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:20.715000 audit[10246]: USER_END pid=10246 uid=0 auid=500 ses=63 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:20.715000 audit[10246]: CRED_DISP pid=10246 uid=0 auid=500 ses=63 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:21.363891 kernel: audit: type=1104 audit(1707818720.715:1755): pid=10246 uid=0 auid=500 ses=63 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:20.716000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@60-139.178.70.43:22-139.178.68.195:38024 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 10:05:23.855150 env[1473]: time="2024-02-13T10:05:23.855041003Z" level=info msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\"" Feb 13 10:05:23.905283 env[1473]: time="2024-02-13T10:05:23.905200936Z" level=error msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\" failed" error="failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:05:23.905569 kubelet[2593]: E0213 10:05:23.905514 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2" Feb 13 10:05:23.905569 kubelet[2593]: E0213 10:05:23.905565 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2} Feb 13 10:05:23.906045 kubelet[2593]: E0213 10:05:23.905618 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:05:23.906045 kubelet[2593]: E0213 10:05:23.905661 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-sv24x" podUID=fe6819ac-25fb-455a-b6b5-7432acf1219d Feb 13 10:05:24.854667 env[1473]: time="2024-02-13T10:05:24.854570597Z" level=info msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\"" Feb 13 10:05:24.898177 env[1473]: time="2024-02-13T10:05:24.898119537Z" level=error msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\" failed" error="failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:05:24.898566 kubelet[2593]: E0213 10:05:24.898384 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox 
\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654" Feb 13 10:05:24.898566 kubelet[2593]: E0213 10:05:24.898426 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654} Feb 13 10:05:24.898566 kubelet[2593]: E0213 10:05:24.898470 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:05:24.898566 kubelet[2593]: E0213 10:05:24.898503 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-zxn6w" podUID=ac15c9fc-cc5d-4a8f-ac09-16f6497ee733 Feb 13 10:05:25.726132 systemd[1]: Started sshd@61-139.178.70.43:22-139.178.68.195:38026.service. Feb 13 10:05:25.726000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@61-139.178.70.43:22-139.178.68.195:38026 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:05:25.768625 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 10:05:25.768746 kernel: audit: type=1130 audit(1707818725.726:1757): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@61-139.178.70.43:22-139.178.68.195:38026 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 10:05:25.853651 env[1473]: time="2024-02-13T10:05:25.853615442Z" level=info msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\"" Feb 13 10:05:25.865976 env[1473]: time="2024-02-13T10:05:25.865915515Z" level=error msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\" failed" error="failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:05:25.866125 kubelet[2593]: E0213 10:05:25.866080 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f" Feb 13 10:05:25.866125 kubelet[2593]: E0213 10:05:25.866109 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f} Feb 13 10:05:25.866318 kubelet[2593]: E0213 10:05:25.866132 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:05:25.866318 kubelet[2593]: E0213 10:05:25.866151 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250 Feb 13 10:05:25.876000 audit[10331]: USER_ACCT pid=10331 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:25.876876 sshd[10331]: Accepted publickey for core from 139.178.68.195 port 38026 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 10:05:25.878673 sshd[10331]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 10:05:25.881248 systemd-logind[1461]: New session 64 of user core. Feb 13 10:05:25.881767 systemd[1]: Started session-64.scope. Feb 13 10:05:25.958639 sshd[10331]: pam_unix(sshd:session): session closed for user core Feb 13 10:05:25.959990 systemd[1]: sshd@61-139.178.70.43:22-139.178.68.195:38026.service: Deactivated successfully. 
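Pairing USER_START with USER_END by pid shows how short-lived these SSH sessions are: sessions 56 through 64 from 139.178.68.195 each close well under a second after opening, the signature of scripted connections (health checks or probes) rather than interactive logins. A pairing sketch over the journald-format audit lines (the year is assumed, since the log omits it):

```python
import re
from datetime import datetime

# Journald-format audit lines, e.g.
#   Feb 13 10:05:20.639000 audit[10246]: USER_START pid=10246 ...
#   Feb 13 10:05:20.715000 audit[10246]: USER_END pid=10246 ...
EVT = re.compile(
    r"(?P<mon>\w{3}) (?P<day>\d+) (?P<time>[\d:.]+) audit\[\d+\]: "
    r"(?P<kind>USER_START|USER_END) pid=(?P<pid>\d+)"
)

def session_durations(lines, year=2024):
    """Pair USER_START/USER_END by pid; return session lengths in seconds."""
    starts, durations = {}, {}
    for line in lines:
        for m in EVT.finditer(line):
            ts = datetime.strptime(
                f"{year} {m['mon']} {m['day']} {m['time']}", "%Y %b %d %H:%M:%S.%f"
            )
            if m["kind"] == "USER_START":
                starts[m["pid"]] = ts
            elif m["pid"] in starts:
                durations[m["pid"]] = (ts - starts.pop(m["pid"])).total_seconds()
    return durations

# Session 63 above: start 10:05:20.639, end 10:05:20.715 -> about 0.08 s.
```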
Feb 13 10:05:25.960426 systemd[1]: session-64.scope: Deactivated successfully. Feb 13 10:05:25.960741 systemd-logind[1461]: Session 64 logged out. Waiting for processes to exit. Feb 13 10:05:25.961171 systemd-logind[1461]: Removed session 64. Feb 13 10:05:25.878000 audit[10331]: CRED_ACQ pid=10331 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:26.058722 kernel: audit: type=1101 audit(1707818725.876:1758): pid=10331 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:26.058762 kernel: audit: type=1103 audit(1707818725.878:1759): pid=10331 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:26.058779 kernel: audit: type=1006 audit(1707818725.878:1760): pid=10331 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=64 res=1 Feb 13 10:05:26.117410 kernel: audit: type=1300 audit(1707818725.878:1760): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe5f4413c0 a2=3 a3=0 items=0 ppid=1 pid=10331 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=64 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:05:25.878000 audit[10331]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe5f4413c0 a2=3 a3=0 items=0 ppid=1 pid=10331 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=64 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:05:26.209372 kernel: audit: type=1327 audit(1707818725.878:1760): proctitle=737368643A20636F7265205B707269765D Feb 13 10:05:25.878000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 10:05:26.239826 kernel: audit: type=1105 audit(1707818725.883:1761): pid=10331 uid=0 auid=500 ses=64 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:25.883000 audit[10331]: USER_START pid=10331 uid=0 auid=500 ses=64 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:26.334239 kernel: audit: type=1103 audit(1707818725.884:1762): pid=10361 uid=0 auid=500 ses=64 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:25.884000 audit[10361]: CRED_ACQ pid=10361 uid=0 auid=500 ses=64 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 
10:05:26.423427 kernel: audit: type=1106 audit(1707818725.958:1763): pid=10331 uid=0 auid=500 ses=64 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:25.958000 audit[10331]: USER_END pid=10331 uid=0 auid=500 ses=64 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:26.519048 kernel: audit: type=1104 audit(1707818725.958:1764): pid=10331 uid=0 auid=500 ses=64 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:25.958000 audit[10331]: CRED_DISP pid=10331 uid=0 auid=500 ses=64 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:25.959000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@61-139.178.70.43:22-139.178.68.195:38026 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:05:27.854290 env[1473]: time="2024-02-13T10:05:27.854167929Z" level=info msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\"" Feb 13 10:05:27.880172 env[1473]: time="2024-02-13T10:05:27.880110220Z" level=error msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\" failed" error="failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:05:27.880313 kubelet[2593]: E0213 10:05:27.880296 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768" Feb 13 10:05:27.880483 kubelet[2593]: E0213 10:05:27.880324 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768} Feb 13 10:05:27.880483 kubelet[2593]: E0213 10:05:27.880372 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 
10:05:27.880483 kubelet[2593]: E0213 10:05:27.880391 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-86cd8c4979-2tlsw" podUID=18384425-4aba-475c-a64f-6bfe3101b275 Feb 13 10:05:30.967578 systemd[1]: Started sshd@62-139.178.70.43:22-139.178.68.195:41428.service. Feb 13 10:05:30.967000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@62-139.178.70.43:22-139.178.68.195:41428 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:05:30.994966 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 10:05:30.995053 kernel: audit: type=1130 audit(1707818730.967:1766): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@62-139.178.70.43:22-139.178.68.195:41428 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:05:31.110000 audit[10412]: USER_ACCT pid=10412 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:31.110839 sshd[10412]: Accepted publickey for core from 139.178.68.195 port 41428 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 10:05:31.112664 sshd[10412]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 10:05:31.115153 systemd-logind[1461]: New session 65 of user core. Feb 13 10:05:31.115759 systemd[1]: Started session-65.scope. Feb 13 10:05:31.195818 sshd[10412]: pam_unix(sshd:session): session closed for user core Feb 13 10:05:31.197345 systemd[1]: sshd@62-139.178.70.43:22-139.178.68.195:41428.service: Deactivated successfully. Feb 13 10:05:31.197775 systemd[1]: session-65.scope: Deactivated successfully. Feb 13 10:05:31.198168 systemd-logind[1461]: Session 65 logged out. Waiting for processes to exit. Feb 13 10:05:31.198752 systemd-logind[1461]: Removed session 65. 
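The PROCTITLE audit records interleaved with each SSH session carry the process command line hex-encoded, because the audit subsystem escapes any value containing spaces or control bytes. Decoding the payload from the records above is a one-liner; a small sketch:

package main

import (
	"encoding/hex"
	"fmt"
)

func main() {
	// proctitle value copied from the audit records above.
	const proctitle = "737368643A20636F7265205B707269765D"
	b, err := hex.DecodeString(proctitle)
	if err != nil {
		panic(err)
	}
	// Prints "sshd: core [priv]" -- the privileged sshd process
	// handling the "core" user's login.
	fmt.Printf("%s\n", b)
}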
Feb 13 10:05:31.112000 audit[10412]: CRED_ACQ pid=10412 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:31.294327 kernel: audit: type=1101 audit(1707818731.110:1767): pid=10412 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:31.294373 kernel: audit: type=1103 audit(1707818731.112:1768): pid=10412 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:31.294393 kernel: audit: type=1006 audit(1707818731.112:1769): pid=10412 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=65 res=1 Feb 13 10:05:31.112000 audit[10412]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe6910ca70 a2=3 a3=0 items=0 ppid=1 pid=10412 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=65 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:05:31.444885 kernel: audit: type=1300 audit(1707818731.112:1769): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe6910ca70 a2=3 a3=0 items=0 ppid=1 pid=10412 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=65 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:05:31.444920 kernel: audit: type=1327 audit(1707818731.112:1769): proctitle=737368643A20636F7265205B707269765D Feb 13 10:05:31.112000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 10:05:31.475335 kernel: audit: type=1105 audit(1707818731.117:1770): pid=10412 uid=0 auid=500 ses=65 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:31.117000 audit[10412]: USER_START pid=10412 uid=0 auid=500 ses=65 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:31.569875 kernel: audit: type=1103 audit(1707818731.118:1771): pid=10414 uid=0 auid=500 ses=65 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:31.118000 audit[10414]: CRED_ACQ pid=10414 uid=0 auid=500 ses=65 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:31.196000 audit[10412]: USER_END pid=10412 uid=0 auid=500 ses=65 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" 
exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:31.754715 kernel: audit: type=1106 audit(1707818731.196:1772): pid=10412 uid=0 auid=500 ses=65 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:31.754751 kernel: audit: type=1104 audit(1707818731.196:1773): pid=10412 uid=0 auid=500 ses=65 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:31.196000 audit[10412]: CRED_DISP pid=10412 uid=0 auid=500 ses=65 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:31.197000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@62-139.178.70.43:22-139.178.68.195:41428 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:05:36.205580 systemd[1]: Started sshd@63-139.178.70.43:22-139.178.68.195:46620.service. Feb 13 10:05:36.205000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@63-139.178.70.43:22-139.178.68.195:46620 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:05:36.232601 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 10:05:36.232673 kernel: audit: type=1130 audit(1707818736.205:1775): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@63-139.178.70.43:22-139.178.68.195:46620 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:05:36.341000 audit[10437]: USER_ACCT pid=10437 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:36.341825 sshd[10437]: Accepted publickey for core from 139.178.68.195 port 46620 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 10:05:36.343660 sshd[10437]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 10:05:36.345996 systemd-logind[1461]: New session 66 of user core. Feb 13 10:05:36.346511 systemd[1]: Started session-66.scope. Feb 13 10:05:36.423238 sshd[10437]: pam_unix(sshd:session): session closed for user core Feb 13 10:05:36.424730 systemd[1]: sshd@63-139.178.70.43:22-139.178.68.195:46620.service: Deactivated successfully. Feb 13 10:05:36.425150 systemd[1]: session-66.scope: Deactivated successfully. Feb 13 10:05:36.425549 systemd-logind[1461]: Session 66 logged out. Waiting for processes to exit. Feb 13 10:05:36.426136 systemd-logind[1461]: Removed session 66. 
Feb 13 10:05:36.343000 audit[10437]: CRED_ACQ pid=10437 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:36.524233 kernel: audit: type=1101 audit(1707818736.341:1776): pid=10437 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:36.524276 kernel: audit: type=1103 audit(1707818736.343:1777): pid=10437 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:36.524292 kernel: audit: type=1006 audit(1707818736.343:1778): pid=10437 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=66 res=1 Feb 13 10:05:36.582825 kernel: audit: type=1300 audit(1707818736.343:1778): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc71497710 a2=3 a3=0 items=0 ppid=1 pid=10437 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=66 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:05:36.343000 audit[10437]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc71497710 a2=3 a3=0 items=0 ppid=1 pid=10437 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=66 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:05:36.674760 kernel: audit: type=1327 audit(1707818736.343:1778): proctitle=737368643A20636F7265205B707269765D Feb 13 10:05:36.343000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 10:05:36.705181 kernel: audit: type=1105 audit(1707818736.348:1779): pid=10437 uid=0 auid=500 ses=66 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:36.348000 audit[10437]: USER_START pid=10437 uid=0 auid=500 ses=66 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:36.799553 kernel: audit: type=1103 audit(1707818736.349:1780): pid=10439 uid=0 auid=500 ses=66 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:36.349000 audit[10439]: CRED_ACQ pid=10439 uid=0 auid=500 ses=66 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:36.853578 env[1473]: time="2024-02-13T10:05:36.853537514Z" level=info msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\"" Feb 13 10:05:36.853792 env[1473]: time="2024-02-13T10:05:36.853756878Z" level=info 
msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\"" Feb 13 10:05:36.867708 env[1473]: time="2024-02-13T10:05:36.867645574Z" level=error msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\" failed" error="failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:05:36.867708 env[1473]: time="2024-02-13T10:05:36.867645877Z" level=error msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\" failed" error="failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:05:36.867847 kubelet[2593]: E0213 10:05:36.867833 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654" Feb 13 10:05:36.868015 kubelet[2593]: E0213 10:05:36.867847 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2" Feb 13 10:05:36.868015 kubelet[2593]: E0213 10:05:36.867867 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654} Feb 13 10:05:36.868015 kubelet[2593]: E0213 10:05:36.867868 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2} Feb 13 10:05:36.868015 kubelet[2593]: E0213 10:05:36.867890 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:05:36.868015 kubelet[2593]: E0213 10:05:36.867892 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:05:36.868153 kubelet[2593]: E0213 10:05:36.867909 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-sv24x" podUID=fe6819ac-25fb-455a-b6b5-7432acf1219d Feb 13 10:05:36.868153 kubelet[2593]: E0213 10:05:36.867910 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-zxn6w" podUID=ac15c9fc-cc5d-4a8f-ac09-16f6497ee733 Feb 13 10:05:36.888693 kernel: audit: type=1106 audit(1707818736.423:1781): pid=10437 uid=0 auid=500 ses=66 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:36.423000 audit[10437]: USER_END pid=10437 uid=0 auid=500 ses=66 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:36.423000 audit[10437]: CRED_DISP pid=10437 uid=0 auid=500 ses=66 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:37.073408 kernel: audit: type=1104 audit(1707818736.423:1782): pid=10437 uid=0 auid=500 ses=66 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:36.424000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@63-139.178.70.43:22-139.178.68.195:46620 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 10:05:37.854874 env[1473]: time="2024-02-13T10:05:37.854790451Z" level=info msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\"" Feb 13 10:05:37.880368 env[1473]: time="2024-02-13T10:05:37.880281311Z" level=error msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\" failed" error="failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:05:37.880594 kubelet[2593]: E0213 10:05:37.880537 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f" Feb 13 10:05:37.880594 kubelet[2593]: E0213 10:05:37.880578 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f} Feb 13 10:05:37.880772 kubelet[2593]: E0213 10:05:37.880600 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:05:37.880772 kubelet[2593]: E0213 10:05:37.880617 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250 Feb 13 10:05:39.854928 env[1473]: time="2024-02-13T10:05:39.854802348Z" level=info msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\"" Feb 13 10:05:39.880631 env[1473]: time="2024-02-13T10:05:39.880592563Z" level=error msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\" failed" error="failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:05:39.880788 kubelet[2593]: E0213 10:05:39.880776 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox 
\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768" Feb 13 10:05:39.880941 kubelet[2593]: E0213 10:05:39.880802 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768} Feb 13 10:05:39.880941 kubelet[2593]: E0213 10:05:39.880828 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:05:39.880941 kubelet[2593]: E0213 10:05:39.880844 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-86cd8c4979-2tlsw" podUID=18384425-4aba-475c-a64f-6bfe3101b275 Feb 13 10:05:41.431979 systemd[1]: Started sshd@64-139.178.70.43:22-139.178.68.195:46628.service. Feb 13 10:05:41.431000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@64-139.178.70.43:22-139.178.68.195:46628 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:05:41.458954 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 10:05:41.459055 kernel: audit: type=1130 audit(1707818741.431:1784): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@64-139.178.70.43:22-139.178.68.195:46628 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:05:41.598000 audit[10579]: USER_ACCT pid=10579 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:41.599478 sshd[10579]: Accepted publickey for core from 139.178.68.195 port 46628 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 10:05:41.602476 sshd[10579]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 10:05:41.608183 systemd-logind[1461]: New session 67 of user core. Feb 13 10:05:41.609566 systemd[1]: Started session-67.scope. 
Feb 13 10:05:41.601000 audit[10579]: CRED_ACQ pid=10579 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:41.692145 sshd[10579]: pam_unix(sshd:session): session closed for user core Feb 13 10:05:41.693729 systemd[1]: sshd@64-139.178.70.43:22-139.178.68.195:46628.service: Deactivated successfully. Feb 13 10:05:41.694386 systemd[1]: session-67.scope: Deactivated successfully. Feb 13 10:05:41.694844 systemd-logind[1461]: Session 67 logged out. Waiting for processes to exit. Feb 13 10:05:41.695267 systemd-logind[1461]: Removed session 67. Feb 13 10:05:41.781140 kernel: audit: type=1101 audit(1707818741.598:1785): pid=10579 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:41.781178 kernel: audit: type=1103 audit(1707818741.601:1786): pid=10579 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:41.781199 kernel: audit: type=1006 audit(1707818741.601:1787): pid=10579 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=67 res=1 Feb 13 10:05:41.839771 kernel: audit: type=1300 audit(1707818741.601:1787): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd89c8b770 a2=3 a3=0 items=0 ppid=1 pid=10579 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=67 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:05:41.601000 audit[10579]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd89c8b770 a2=3 a3=0 items=0 ppid=1 pid=10579 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=67 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:05:41.931654 kernel: audit: type=1327 audit(1707818741.601:1787): proctitle=737368643A20636F7265205B707269765D Feb 13 10:05:41.601000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 10:05:41.962082 kernel: audit: type=1105 audit(1707818741.616:1788): pid=10579 uid=0 auid=500 ses=67 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:41.616000 audit[10579]: USER_START pid=10579 uid=0 auid=500 ses=67 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:42.056411 kernel: audit: type=1103 audit(1707818741.618:1789): pid=10581 uid=0 auid=500 ses=67 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:41.618000 audit[10581]: CRED_ACQ pid=10581 uid=0 auid=500 ses=67 
subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:42.145515 kernel: audit: type=1106 audit(1707818741.692:1790): pid=10579 uid=0 auid=500 ses=67 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:41.692000 audit[10579]: USER_END pid=10579 uid=0 auid=500 ses=67 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:42.240907 kernel: audit: type=1104 audit(1707818741.692:1791): pid=10579 uid=0 auid=500 ses=67 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:41.692000 audit[10579]: CRED_DISP pid=10579 uid=0 auid=500 ses=67 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:41.693000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@64-139.178.70.43:22-139.178.68.195:46628 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:05:46.702342 systemd[1]: Started sshd@65-139.178.70.43:22-139.178.68.195:52800.service. Feb 13 10:05:46.702000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@65-139.178.70.43:22-139.178.68.195:52800 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:05:46.729154 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 10:05:46.729226 kernel: audit: type=1130 audit(1707818746.702:1793): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@65-139.178.70.43:22-139.178.68.195:52800 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:05:46.838000 audit[10605]: USER_ACCT pid=10605 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:46.838824 sshd[10605]: Accepted publickey for core from 139.178.68.195 port 52800 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 10:05:46.840663 sshd[10605]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 10:05:46.843206 systemd-logind[1461]: New session 68 of user core. Feb 13 10:05:46.844281 systemd[1]: Started session-68.scope. Feb 13 10:05:46.922964 sshd[10605]: pam_unix(sshd:session): session closed for user core Feb 13 10:05:46.924838 systemd[1]: sshd@65-139.178.70.43:22-139.178.68.195:52800.service: Deactivated successfully. Feb 13 10:05:46.925273 systemd[1]: session-68.scope: Deactivated successfully. 
Feb 13 10:05:46.925636 systemd-logind[1461]: Session 68 logged out. Waiting for processes to exit. Feb 13 10:05:46.926105 systemd-logind[1461]: Removed session 68. Feb 13 10:05:46.840000 audit[10605]: CRED_ACQ pid=10605 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:47.021116 kernel: audit: type=1101 audit(1707818746.838:1794): pid=10605 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:47.021153 kernel: audit: type=1103 audit(1707818746.840:1795): pid=10605 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:47.021170 kernel: audit: type=1006 audit(1707818746.840:1796): pid=10605 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=68 res=1 Feb 13 10:05:47.079695 kernel: audit: type=1300 audit(1707818746.840:1796): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff6d205340 a2=3 a3=0 items=0 ppid=1 pid=10605 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=68 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:05:46.840000 audit[10605]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff6d205340 a2=3 a3=0 items=0 ppid=1 pid=10605 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=68 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:05:47.171678 kernel: audit: type=1327 audit(1707818746.840:1796): proctitle=737368643A20636F7265205B707269765D Feb 13 10:05:46.840000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 10:05:46.846000 audit[10605]: USER_START pid=10605 uid=0 auid=500 ses=68 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:47.202423 kernel: audit: type=1105 audit(1707818746.846:1797): pid=10605 uid=0 auid=500 ses=68 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:46.847000 audit[10607]: CRED_ACQ pid=10607 uid=0 auid=500 ses=68 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:47.385883 kernel: audit: type=1103 audit(1707818746.847:1798): pid=10607 uid=0 auid=500 ses=68 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:47.385918 kernel: audit: type=1106 audit(1707818746.923:1799): pid=10605 uid=0 
auid=500 ses=68 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:46.923000 audit[10605]: USER_END pid=10605 uid=0 auid=500 ses=68 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:46.923000 audit[10605]: CRED_DISP pid=10605 uid=0 auid=500 ses=68 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:47.570578 kernel: audit: type=1104 audit(1707818746.923:1800): pid=10605 uid=0 auid=500 ses=68 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:46.924000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@65-139.178.70.43:22-139.178.68.195:52800 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:05:47.855166 env[1473]: time="2024-02-13T10:05:47.854932417Z" level=info msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\"" Feb 13 10:05:47.855166 env[1473]: time="2024-02-13T10:05:47.854932422Z" level=info msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\"" Feb 13 10:05:47.904455 env[1473]: time="2024-02-13T10:05:47.904351410Z" level=error msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\" failed" error="failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:05:47.904654 kubelet[2593]: E0213 10:05:47.904607 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654" Feb 13 10:05:47.905040 kubelet[2593]: E0213 10:05:47.904659 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654} Feb 13 10:05:47.905040 kubelet[2593]: E0213 10:05:47.904709 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that 
the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:05:47.905040 kubelet[2593]: E0213 10:05:47.904749 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-zxn6w" podUID=ac15c9fc-cc5d-4a8f-ac09-16f6497ee733 Feb 13 10:05:47.905316 env[1473]: time="2024-02-13T10:05:47.905116796Z" level=error msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\" failed" error="failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:05:47.905410 kubelet[2593]: E0213 10:05:47.905360 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2" Feb 13 10:05:47.905410 kubelet[2593]: E0213 10:05:47.905403 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2} Feb 13 10:05:47.905538 kubelet[2593]: E0213 10:05:47.905449 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:05:47.905538 kubelet[2593]: E0213 10:05:47.905485 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-sv24x" podUID=fe6819ac-25fb-455a-b6b5-7432acf1219d Feb 13 10:05:50.854304 env[1473]: time="2024-02-13T10:05:50.854175783Z" level=info msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\"" Feb 13 10:05:50.907112 env[1473]: time="2024-02-13T10:05:50.906984153Z" level=error msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\" failed" error="failed to destroy network for 
sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:05:50.907535 kubelet[2593]: E0213 10:05:50.907448 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f" Feb 13 10:05:50.907535 kubelet[2593]: E0213 10:05:50.907535 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f} Feb 13 10:05:50.908354 kubelet[2593]: E0213 10:05:50.907630 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:05:50.908354 kubelet[2593]: E0213 10:05:50.907706 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250 Feb 13 10:05:51.854000 env[1473]: time="2024-02-13T10:05:51.853905455Z" level=info msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\"" Feb 13 10:05:51.905918 env[1473]: time="2024-02-13T10:05:51.905818823Z" level=error msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\" failed" error="failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:05:51.906363 kubelet[2593]: E0213 10:05:51.906129 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768" Feb 13 10:05:51.906363 kubelet[2593]: E0213 10:05:51.906181 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" 
podSandboxID={Type:containerd ID:b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768} Feb 13 10:05:51.906363 kubelet[2593]: E0213 10:05:51.906233 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:05:51.906363 kubelet[2593]: E0213 10:05:51.906274 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-86cd8c4979-2tlsw" podUID=18384425-4aba-475c-a64f-6bfe3101b275 Feb 13 10:05:51.928218 systemd[1]: Started sshd@66-139.178.70.43:22-139.178.68.195:52810.service. Feb 13 10:05:51.928000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@66-139.178.70.43:22-139.178.68.195:52810 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:05:51.954524 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 10:05:51.954650 kernel: audit: type=1130 audit(1707818751.928:1802): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@66-139.178.70.43:22-139.178.68.195:52810 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:05:52.063000 audit[10752]: USER_ACCT pid=10752 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:52.063768 sshd[10752]: Accepted publickey for core from 139.178.68.195 port 52810 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 10:05:52.065671 sshd[10752]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 10:05:52.067995 systemd-logind[1461]: New session 69 of user core. Feb 13 10:05:52.068572 systemd[1]: Started session-69.scope. Feb 13 10:05:52.148036 sshd[10752]: pam_unix(sshd:session): session closed for user core Feb 13 10:05:52.149400 systemd[1]: sshd@66-139.178.70.43:22-139.178.68.195:52810.service: Deactivated successfully. Feb 13 10:05:52.149836 systemd[1]: session-69.scope: Deactivated successfully. Feb 13 10:05:52.150187 systemd-logind[1461]: Session 69 logged out. Waiting for processes to exit. Feb 13 10:05:52.150752 systemd-logind[1461]: Removed session 69. 
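The "rpc error: code = Unknown desc = ..." prefix on every failure above is added by the CRI's gRPC transport: containerd returns a plain error from the CNI delete, and any non-status error crossing a gRPC boundary is surfaced to the kubelet client with code Unknown. A small sketch of that behavior using google.golang.org/grpc/status:

package main

import (
	"errors"
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

func main() {
	// A plain error, as containerd would return from the CNI plugin.
	err := errors.New("failed to destroy network for sandbox \"b3cf...\": " +
		"plugin type=\"calico\" failed (delete)")

	// status.Convert turns any non-status error into codes.Unknown,
	// which is exactly the prefix the kubelet logs above.
	st := status.Convert(err)
	fmt.Println(st.Code() == codes.Unknown) // true
	fmt.Printf("rpc error: code = %s desc = %s\n", st.Code(), st.Message())
}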
Feb 13 10:05:52.065000 audit[10752]: CRED_ACQ pid=10752 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:52.245561 kernel: audit: type=1101 audit(1707818752.063:1803): pid=10752 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:52.245650 kernel: audit: type=1103 audit(1707818752.065:1804): pid=10752 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:52.245668 kernel: audit: type=1006 audit(1707818752.065:1805): pid=10752 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=69 res=1 Feb 13 10:05:52.304167 kernel: audit: type=1300 audit(1707818752.065:1805): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd510b1250 a2=3 a3=0 items=0 ppid=1 pid=10752 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=69 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:05:52.065000 audit[10752]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd510b1250 a2=3 a3=0 items=0 ppid=1 pid=10752 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=69 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:05:52.396153 kernel: audit: type=1327 audit(1707818752.065:1805): proctitle=737368643A20636F7265205B707269765D Feb 13 10:05:52.065000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 10:05:52.426635 kernel: audit: type=1105 audit(1707818752.070:1806): pid=10752 uid=0 auid=500 ses=69 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:52.070000 audit[10752]: USER_START pid=10752 uid=0 auid=500 ses=69 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:52.521042 kernel: audit: type=1103 audit(1707818752.070:1807): pid=10754 uid=0 auid=500 ses=69 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:52.070000 audit[10754]: CRED_ACQ pid=10754 uid=0 auid=500 ses=69 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:52.610205 kernel: audit: type=1106 audit(1707818752.147:1808): pid=10752 uid=0 auid=500 ses=69 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close 
grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:52.147000 audit[10752]: USER_END pid=10752 uid=0 auid=500 ses=69 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:52.147000 audit[10752]: CRED_DISP pid=10752 uid=0 auid=500 ses=69 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:52.794919 kernel: audit: type=1104 audit(1707818752.147:1809): pid=10752 uid=0 auid=500 ses=69 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:52.147000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@66-139.178.70.43:22-139.178.68.195:52810 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:05:55.211000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:05:55.211000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c0017f9110 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 10:05:55.211000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 10:05:55.211000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:05:55.211000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=d a1=c00113db20 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 10:05:55.211000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 10:05:55.439000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 
tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:05:55.439000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=67 a1=c0113cbce0 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 10:05:55.439000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 10:05:55.439000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:05:55.439000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=67 a1=c003abc3a0 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 10:05:55.439000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 10:05:55.441000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:05:55.441000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=67 a1=c00b1a5da0 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 10:05:55.441000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 10:05:55.443000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:05:55.443000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=67 a1=c0018b5a80 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 10:05:55.443000 audit: PROCTITLE 
proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 10:05:55.443000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/apiserver.crt" dev="sdb9" ino=525073 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:05:55.443000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=67 a1=c0113cbd70 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 10:05:55.443000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 10:05:55.443000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-client.crt" dev="sdb9" ino=525079 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:05:55.443000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=69 a1=c00c0f29c0 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 10:05:55.443000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 10:05:57.158530 systemd[1]: Started sshd@67-139.178.70.43:22-139.178.68.195:50164.service. Feb 13 10:05:57.158000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@67-139.178.70.43:22-139.178.68.195:50164 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:05:57.185773 kernel: kauditd_printk_skb: 25 callbacks suppressed Feb 13 10:05:57.185811 kernel: audit: type=1130 audit(1707818757.158:1819): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@67-139.178.70.43:22-139.178.68.195:50164 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 10:05:57.296093 sshd[10778]: Accepted publickey for core from 139.178.68.195 port 50164 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 10:05:57.295000 audit[10778]: USER_ACCT pid=10778 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:57.297633 sshd[10778]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 10:05:57.300055 systemd-logind[1461]: New session 70 of user core. Feb 13 10:05:57.300567 systemd[1]: Started session-70.scope. Feb 13 10:05:57.377201 sshd[10778]: pam_unix(sshd:session): session closed for user core Feb 13 10:05:57.378763 systemd[1]: sshd@67-139.178.70.43:22-139.178.68.195:50164.service: Deactivated successfully. Feb 13 10:05:57.379173 systemd[1]: session-70.scope: Deactivated successfully. Feb 13 10:05:57.379573 systemd-logind[1461]: Session 70 logged out. Waiting for processes to exit. Feb 13 10:05:57.380132 systemd-logind[1461]: Removed session 70. Feb 13 10:05:57.297000 audit[10778]: CRED_ACQ pid=10778 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:57.477960 kernel: audit: type=1101 audit(1707818757.295:1820): pid=10778 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:57.478000 kernel: audit: type=1103 audit(1707818757.297:1821): pid=10778 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:57.478018 kernel: audit: type=1006 audit(1707818757.297:1822): pid=10778 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=70 res=1 Feb 13 10:05:57.536546 kernel: audit: type=1300 audit(1707818757.297:1822): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe5fd2b160 a2=3 a3=0 items=0 ppid=1 pid=10778 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=70 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:05:57.297000 audit[10778]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe5fd2b160 a2=3 a3=0 items=0 ppid=1 pid=10778 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=70 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:05:57.297000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 10:05:57.658934 kernel: audit: type=1327 audit(1707818757.297:1822): proctitle=737368643A20636F7265205B707269765D Feb 13 10:05:57.658970 kernel: audit: type=1105 audit(1707818757.302:1823): pid=10778 uid=0 auid=500 ses=70 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 
10:05:57.302000 audit[10778]: USER_START pid=10778 uid=0 auid=500 ses=70 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:57.303000 audit[10780]: CRED_ACQ pid=10780 uid=0 auid=500 ses=70 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:57.377000 audit[10778]: USER_END pid=10778 uid=0 auid=500 ses=70 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:57.937977 kernel: audit: type=1103 audit(1707818757.303:1824): pid=10780 uid=0 auid=500 ses=70 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:57.938013 kernel: audit: type=1106 audit(1707818757.377:1825): pid=10778 uid=0 auid=500 ses=70 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:57.938032 kernel: audit: type=1104 audit(1707818757.377:1826): pid=10778 uid=0 auid=500 ses=70 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:57.377000 audit[10778]: CRED_DISP pid=10778 uid=0 auid=500 ses=70 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:05:57.378000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@67-139.178.70.43:22-139.178.68.195:50164 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 10:05:58.853885 env[1473]: time="2024-02-13T10:05:58.853828691Z" level=info msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\"" Feb 13 10:05:58.867320 env[1473]: time="2024-02-13T10:05:58.867256430Z" level=error msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\" failed" error="failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:05:58.867494 kubelet[2593]: E0213 10:05:58.867468 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654" Feb 13 10:05:58.867494 kubelet[2593]: E0213 10:05:58.867493 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654} Feb 13 10:05:58.867686 kubelet[2593]: E0213 10:05:58.867529 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:05:58.867686 kubelet[2593]: E0213 10:05:58.867547 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-zxn6w" podUID=ac15c9fc-cc5d-4a8f-ac09-16f6497ee733 Feb 13 10:06:00.855008 env[1473]: time="2024-02-13T10:06:00.854875749Z" level=info msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\"" Feb 13 10:06:00.880918 env[1473]: time="2024-02-13T10:06:00.880881791Z" level=error msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\" failed" error="failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:06:00.881071 kubelet[2593]: E0213 10:06:00.881060 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox 
\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2" Feb 13 10:06:00.881235 kubelet[2593]: E0213 10:06:00.881088 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2} Feb 13 10:06:00.881235 kubelet[2593]: E0213 10:06:00.881112 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:06:00.881235 kubelet[2593]: E0213 10:06:00.881130 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-sv24x" podUID=fe6819ac-25fb-455a-b6b5-7432acf1219d Feb 13 10:06:02.380823 systemd[1]: Started sshd@68-139.178.70.43:22-139.178.68.195:50170.service. Feb 13 10:06:02.379000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@68-139.178.70.43:22-139.178.68.195:50170 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:06:02.407682 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 10:06:02.407772 kernel: audit: type=1130 audit(1707818762.379:1828): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@68-139.178.70.43:22-139.178.68.195:50170 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:06:02.516000 audit[10859]: USER_ACCT pid=10859 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:02.517727 sshd[10859]: Accepted publickey for core from 139.178.68.195 port 50170 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 10:06:02.520627 sshd[10859]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 10:06:02.522970 systemd-logind[1461]: New session 71 of user core. Feb 13 10:06:02.523493 systemd[1]: Started session-71.scope. 
Feb 13 10:06:02.519000 audit[10859]: CRED_ACQ pid=10859 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:02.700666 kernel: audit: type=1101 audit(1707818762.516:1829): pid=10859 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:02.700707 kernel: audit: type=1103 audit(1707818762.519:1830): pid=10859 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:02.700725 kernel: audit: type=1006 audit(1707818762.519:1831): pid=10859 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=71 res=1 Feb 13 10:06:02.759317 kernel: audit: type=1300 audit(1707818762.519:1831): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe6f00ef60 a2=3 a3=0 items=0 ppid=1 pid=10859 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=71 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:06:02.519000 audit[10859]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe6f00ef60 a2=3 a3=0 items=0 ppid=1 pid=10859 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=71 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:06:02.851302 kernel: audit: type=1327 audit(1707818762.519:1831): proctitle=737368643A20636F7265205B707269765D Feb 13 10:06:02.519000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 10:06:02.851535 sshd[10859]: pam_unix(sshd:session): session closed for user core Feb 13 10:06:02.853038 systemd[1]: sshd@68-139.178.70.43:22-139.178.68.195:50170.service: Deactivated successfully. Feb 13 10:06:02.853496 systemd[1]: session-71.scope: Deactivated successfully. Feb 13 10:06:02.853917 systemd-logind[1461]: Session 71 logged out. Waiting for processes to exit. Feb 13 10:06:02.854312 systemd-logind[1461]: Removed session 71. 
Feb 13 10:06:02.881745 kernel: audit: type=1105 audit(1707818762.524:1832): pid=10859 uid=0 auid=500 ses=71 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:02.524000 audit[10859]: USER_START pid=10859 uid=0 auid=500 ses=71 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:02.976200 kernel: audit: type=1103 audit(1707818762.525:1833): pid=10861 uid=0 auid=500 ses=71 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:02.525000 audit[10861]: CRED_ACQ pid=10861 uid=0 auid=500 ses=71 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:03.065362 kernel: audit: type=1106 audit(1707818762.850:1834): pid=10859 uid=0 auid=500 ses=71 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:02.850000 audit[10859]: USER_END pid=10859 uid=0 auid=500 ses=71 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:03.160834 kernel: audit: type=1104 audit(1707818762.851:1835): pid=10859 uid=0 auid=500 ses=71 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:02.851000 audit[10859]: CRED_DISP pid=10859 uid=0 auid=500 ses=71 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:02.851000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@68-139.178.70.43:22-139.178.68.195:50170 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 10:06:03.854641 env[1473]: time="2024-02-13T10:06:03.854507248Z" level=info msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\"" Feb 13 10:06:03.854641 env[1473]: time="2024-02-13T10:06:03.854566654Z" level=info msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\"" Feb 13 10:06:03.881685 env[1473]: time="2024-02-13T10:06:03.881580191Z" level=error msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\" failed" error="failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:06:03.881918 kubelet[2593]: E0213 10:06:03.881871 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768" Feb 13 10:06:03.882121 kubelet[2593]: E0213 10:06:03.881933 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768} Feb 13 10:06:03.882121 kubelet[2593]: E0213 10:06:03.881955 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:06:03.882121 kubelet[2593]: E0213 10:06:03.881974 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-86cd8c4979-2tlsw" podUID=18384425-4aba-475c-a64f-6bfe3101b275 Feb 13 10:06:03.882252 env[1473]: time="2024-02-13T10:06:03.882074119Z" level=error msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\" failed" error="failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:06:03.882280 kubelet[2593]: E0213 10:06:03.882217 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox 
\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f" Feb 13 10:06:03.882280 kubelet[2593]: E0213 10:06:03.882233 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f} Feb 13 10:06:03.882280 kubelet[2593]: E0213 10:06:03.882250 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:06:03.882280 kubelet[2593]: E0213 10:06:03.882265 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250 Feb 13 10:06:07.801646 systemd[1]: Started sshd@69-139.178.70.43:22-139.178.68.195:35512.service. Feb 13 10:06:07.800000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@69-139.178.70.43:22-139.178.68.195:35512 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:06:07.828678 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 10:06:07.828753 kernel: audit: type=1130 audit(1707818767.800:1837): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@69-139.178.70.43:22-139.178.68.195:35512 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:06:07.937000 audit[10944]: USER_ACCT pid=10944 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:07.938584 sshd[10944]: Accepted publickey for core from 139.178.68.195 port 35512 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 10:06:07.941114 sshd[10944]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 10:06:07.943703 systemd-logind[1461]: New session 72 of user core. Feb 13 10:06:07.944172 systemd[1]: Started session-72.scope. Feb 13 10:06:08.024357 sshd[10944]: pam_unix(sshd:session): session closed for user core Feb 13 10:06:08.025736 systemd[1]: sshd@69-139.178.70.43:22-139.178.68.195:35512.service: Deactivated successfully. Feb 13 10:06:08.026159 systemd[1]: session-72.scope: Deactivated successfully. Feb 13 10:06:08.026616 systemd-logind[1461]: Session 72 logged out. 
Waiting for processes to exit. Feb 13 10:06:08.027155 systemd-logind[1461]: Removed session 72. Feb 13 10:06:07.939000 audit[10944]: CRED_ACQ pid=10944 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:08.120959 kernel: audit: type=1101 audit(1707818767.937:1838): pid=10944 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:08.120996 kernel: audit: type=1103 audit(1707818767.939:1839): pid=10944 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:08.121014 kernel: audit: type=1006 audit(1707818767.939:1840): pid=10944 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=72 res=1 Feb 13 10:06:07.939000 audit[10944]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffcbce1e3b0 a2=3 a3=0 items=0 ppid=1 pid=10944 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=72 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:06:08.271525 kernel: audit: type=1300 audit(1707818767.939:1840): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffcbce1e3b0 a2=3 a3=0 items=0 ppid=1 pid=10944 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=72 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:06:08.271559 kernel: audit: type=1327 audit(1707818767.939:1840): proctitle=737368643A20636F7265205B707269765D Feb 13 10:06:07.939000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 10:06:08.301964 kernel: audit: type=1105 audit(1707818767.944:1841): pid=10944 uid=0 auid=500 ses=72 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:07.944000 audit[10944]: USER_START pid=10944 uid=0 auid=500 ses=72 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:08.396421 kernel: audit: type=1103 audit(1707818767.945:1842): pid=10946 uid=0 auid=500 ses=72 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:07.945000 audit[10946]: CRED_ACQ pid=10946 uid=0 auid=500 ses=72 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:08.485538 kernel: audit: type=1106 audit(1707818768.023:1843): pid=10944 uid=0 auid=500 ses=72 subj=system_u:system_r:kernel_t:s0 
msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:08.023000 audit[10944]: USER_END pid=10944 uid=0 auid=500 ses=72 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:08.581027 kernel: audit: type=1104 audit(1707818768.023:1844): pid=10944 uid=0 auid=500 ses=72 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:08.023000 audit[10944]: CRED_DISP pid=10944 uid=0 auid=500 ses=72 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:08.024000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@69-139.178.70.43:22-139.178.68.195:35512 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:06:09.617000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:06:09.617000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c00133df40 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 10:06:09.617000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 10:06:09.617000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:06:09.617000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=c a1=c001b7db20 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 10:06:09.617000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 10:06:09.620000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 
tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:06:09.620000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c001d21f40 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 10:06:09.620000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 10:06:09.620000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:06:09.620000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c001b7db40 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 10:06:09.620000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 10:06:11.854465 env[1473]: time="2024-02-13T10:06:11.854376915Z" level=info msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\"" Feb 13 10:06:11.855360 env[1473]: time="2024-02-13T10:06:11.854597163Z" level=info msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\"" Feb 13 10:06:11.908527 env[1473]: time="2024-02-13T10:06:11.908440903Z" level=error msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\" failed" error="failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:06:11.908527 env[1473]: time="2024-02-13T10:06:11.908457928Z" level=error msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\" failed" error="failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:06:11.908770 kubelet[2593]: E0213 10:06:11.908682 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654" Feb 
13 10:06:11.908770 kubelet[2593]: E0213 10:06:11.908693 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2" Feb 13 10:06:11.908770 kubelet[2593]: E0213 10:06:11.908724 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654} Feb 13 10:06:11.908770 kubelet[2593]: E0213 10:06:11.908729 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2} Feb 13 10:06:11.908770 kubelet[2593]: E0213 10:06:11.908766 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:06:11.909288 kubelet[2593]: E0213 10:06:11.908770 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:06:11.909288 kubelet[2593]: E0213 10:06:11.908798 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-zxn6w" podUID=ac15c9fc-cc5d-4a8f-ac09-16f6497ee733 Feb 13 10:06:11.909288 kubelet[2593]: E0213 10:06:11.908802 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-sv24x" podUID=fe6819ac-25fb-455a-b6b5-7432acf1219d Feb 13 10:06:13.033314 systemd[1]: Started sshd@70-139.178.70.43:22-139.178.68.195:35516.service. 
Feb 13 10:06:13.032000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@70-139.178.70.43:22-139.178.68.195:35516 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:06:13.060364 kernel: kauditd_printk_skb: 13 callbacks suppressed Feb 13 10:06:13.060415 kernel: audit: type=1130 audit(1707818773.032:1850): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@70-139.178.70.43:22-139.178.68.195:35516 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:06:13.173000 audit[11032]: USER_ACCT pid=11032 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:13.175379 sshd[11032]: Accepted publickey for core from 139.178.68.195 port 35516 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 10:06:13.176653 sshd[11032]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 10:06:13.179178 systemd-logind[1461]: New session 73 of user core. Feb 13 10:06:13.179819 systemd[1]: Started session-73.scope. Feb 13 10:06:13.260510 sshd[11032]: pam_unix(sshd:session): session closed for user core Feb 13 10:06:13.262054 systemd[1]: sshd@70-139.178.70.43:22-139.178.68.195:35516.service: Deactivated successfully. Feb 13 10:06:13.262520 systemd[1]: session-73.scope: Deactivated successfully. Feb 13 10:06:13.262880 systemd-logind[1461]: Session 73 logged out. Waiting for processes to exit. Feb 13 10:06:13.263288 systemd-logind[1461]: Removed session 73. 
Feb 13 10:06:13.175000 audit[11032]: CRED_ACQ pid=11032 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:06:13.359047 kernel: audit: type=1101 audit(1707818773.173:1851): pid=11032 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:06:13.359086 kernel: audit: type=1103 audit(1707818773.175:1852): pid=11032 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:06:13.359104 kernel: audit: type=1006 audit(1707818773.175:1853): pid=11032 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=73 res=1
Feb 13 10:06:13.417656 kernel: audit: type=1300 audit(1707818773.175:1853): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffed402cbe0 a2=3 a3=0 items=0 ppid=1 pid=11032 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=73 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 10:06:13.175000 audit[11032]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffed402cbe0 a2=3 a3=0 items=0 ppid=1 pid=11032 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=73 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 10:06:13.509554 kernel: audit: type=1327 audit(1707818773.175:1853): proctitle=737368643A20636F7265205B707269765D
Feb 13 10:06:13.175000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D
Feb 13 10:06:13.539979 kernel: audit: type=1105 audit(1707818773.180:1854): pid=11032 uid=0 auid=500 ses=73 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:06:13.180000 audit[11032]: USER_START pid=11032 uid=0 auid=500 ses=73 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:06:13.634357 kernel: audit: type=1103 audit(1707818773.181:1855): pid=11037 uid=0 auid=500 ses=73 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:06:13.181000 audit[11037]: CRED_ACQ pid=11037 uid=0 auid=500 ses=73 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:06:13.723423 kernel: audit: type=1106 audit(1707818773.259:1856): pid=11032 uid=0 auid=500 ses=73 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:06:13.259000 audit[11032]: USER_END pid=11032 uid=0 auid=500 ses=73 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:06:13.818814 kernel: audit: type=1104 audit(1707818773.260:1857): pid=11032 uid=0 auid=500 ses=73 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:06:13.260000 audit[11032]: CRED_DISP pid=11032 uid=0 auid=500 ses=73 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:06:13.260000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@70-139.178.70.43:22-139.178.68.195:35516 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 10:06:17.855208 env[1473]: time="2024-02-13T10:06:17.855111047Z" level=info msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\""
Feb 13 10:06:17.906129 env[1473]: time="2024-02-13T10:06:17.906026383Z" level=error msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\" failed" error="failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 10:06:17.906323 kubelet[2593]: E0213 10:06:17.906299 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768"
Feb 13 10:06:17.906743 kubelet[2593]: E0213 10:06:17.906361 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768}
Feb 13 10:06:17.906743 kubelet[2593]: E0213 10:06:17.906428 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 13 10:06:17.906743 kubelet[2593]: E0213 10:06:17.906472 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-86cd8c4979-2tlsw" podUID=18384425-4aba-475c-a64f-6bfe3101b275
Feb 13 10:06:18.268547 systemd[1]: Started sshd@71-139.178.70.43:22-139.178.68.195:46778.service.
Feb 13 10:06:18.267000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@71-139.178.70.43:22-139.178.68.195:46778 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 10:06:18.295372 kernel: kauditd_printk_skb: 1 callbacks suppressed
Feb 13 10:06:18.295425 kernel: audit: type=1130 audit(1707818778.267:1859): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@71-139.178.70.43:22-139.178.68.195:46778 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 10:06:18.401000 audit[11090]: USER_ACCT pid=11090 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:06:18.403225 sshd[11090]: Accepted publickey for core from 139.178.68.195 port 46778 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w
Feb 13 10:06:18.404670 sshd[11090]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Feb 13 10:06:18.407148 systemd-logind[1461]: New session 74 of user core.
Feb 13 10:06:18.407686 systemd[1]: Started session-74.scope.
Feb 13 10:06:18.484512 sshd[11090]: pam_unix(sshd:session): session closed for user core
Feb 13 10:06:18.485967 systemd[1]: sshd@71-139.178.70.43:22-139.178.68.195:46778.service: Deactivated successfully.
Feb 13 10:06:18.486402 systemd[1]: session-74.scope: Deactivated successfully.
Feb 13 10:06:18.486782 systemd-logind[1461]: Session 74 logged out. Waiting for processes to exit.
Feb 13 10:06:18.487183 systemd-logind[1461]: Removed session 74.
Feb 13 10:06:18.403000 audit[11090]: CRED_ACQ pid=11090 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:06:18.584814 kernel: audit: type=1101 audit(1707818778.401:1860): pid=11090 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:06:18.584853 kernel: audit: type=1103 audit(1707818778.403:1861): pid=11090 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:06:18.584873 kernel: audit: type=1006 audit(1707818778.403:1862): pid=11090 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=74 res=1
Feb 13 10:06:18.643419 kernel: audit: type=1300 audit(1707818778.403:1862): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff8b9c9a00 a2=3 a3=0 items=0 ppid=1 pid=11090 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=74 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 10:06:18.403000 audit[11090]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff8b9c9a00 a2=3 a3=0 items=0 ppid=1 pid=11090 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=74 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 10:06:18.735267 kernel: audit: type=1327 audit(1707818778.403:1862): proctitle=737368643A20636F7265205B707269765D
Feb 13 10:06:18.403000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D
Feb 13 10:06:18.765656 kernel: audit: type=1105 audit(1707818778.408:1863): pid=11090 uid=0 auid=500 ses=74 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:06:18.408000 audit[11090]: USER_START pid=11090 uid=0 auid=500 ses=74 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:06:18.853200 env[1473]: time="2024-02-13T10:06:18.853177362Z" level=info msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\""
Feb 13 10:06:18.408000 audit[11092]: CRED_ACQ pid=11092 uid=0 auid=500 ses=74 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:06:18.864980 env[1473]: time="2024-02-13T10:06:18.864922747Z" level=error msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\" failed" error="failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 10:06:18.865161 kubelet[2593]: E0213 10:06:18.865077 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f"
Feb 13 10:06:18.865161 kubelet[2593]: E0213 10:06:18.865105 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f}
Feb 13 10:06:18.865161 kubelet[2593]: E0213 10:06:18.865127 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 13 10:06:18.865161 kubelet[2593]: E0213 10:06:18.865144 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250
Feb 13 10:06:18.949139 kernel: audit: type=1103 audit(1707818778.408:1864): pid=11092 uid=0 auid=500 ses=74 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:06:18.949170 kernel: audit: type=1106 audit(1707818778.483:1865): pid=11090 uid=0 auid=500 ses=74 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:06:18.483000 audit[11090]: USER_END pid=11090 uid=0 auid=500 ses=74 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:06:19.044432 kernel: audit: type=1104 audit(1707818778.483:1866): pid=11090 uid=0 auid=500 ses=74 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:06:18.483000 audit[11090]: CRED_DISP pid=11090 uid=0 auid=500 ses=74 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:06:18.484000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@71-139.178.70.43:22-139.178.68.195:46778 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 10:06:23.495701 systemd[1]: Started sshd@72-139.178.70.43:22-139.178.68.195:46782.service.
Feb 13 10:06:23.494000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@72-139.178.70.43:22-139.178.68.195:46782 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 10:06:23.523075 kernel: kauditd_printk_skb: 1 callbacks suppressed
Feb 13 10:06:23.523170 kernel: audit: type=1130 audit(1707818783.494:1868): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@72-139.178.70.43:22-139.178.68.195:46782 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 10:06:23.631000 audit[11142]: USER_ACCT pid=11142 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:06:23.632932 sshd[11142]: Accepted publickey for core from 139.178.68.195 port 46782 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w
Feb 13 10:06:23.633637 sshd[11142]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Feb 13 10:06:23.635867 systemd-logind[1461]: New session 75 of user core.
Feb 13 10:06:23.636377 systemd[1]: Started session-75.scope.
Feb 13 10:06:23.714895 sshd[11142]: pam_unix(sshd:session): session closed for user core
Feb 13 10:06:23.716217 systemd[1]: sshd@72-139.178.70.43:22-139.178.68.195:46782.service: Deactivated successfully.
Feb 13 10:06:23.716654 systemd[1]: session-75.scope: Deactivated successfully.
Feb 13 10:06:23.717034 systemd-logind[1461]: Session 75 logged out. Waiting for processes to exit.
Feb 13 10:06:23.717515 systemd-logind[1461]: Removed session 75.
Feb 13 10:06:23.632000 audit[11142]: CRED_ACQ pid=11142 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:06:23.814468 kernel: audit: type=1101 audit(1707818783.631:1869): pid=11142 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:06:23.814543 kernel: audit: type=1103 audit(1707818783.632:1870): pid=11142 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:06:23.814575 kernel: audit: type=1006 audit(1707818783.632:1871): pid=11142 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=75 res=1
Feb 13 10:06:23.632000 audit[11142]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffff66fdd60 a2=3 a3=0 items=0 ppid=1 pid=11142 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=75 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 10:06:23.964909 kernel: audit: type=1300 audit(1707818783.632:1871): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffff66fdd60 a2=3 a3=0 items=0 ppid=1 pid=11142 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=75 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 10:06:23.964962 kernel: audit: type=1327 audit(1707818783.632:1871): proctitle=737368643A20636F7265205B707269765D
Feb 13 10:06:23.632000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D
Feb 13 10:06:23.995370 kernel: audit: type=1105 audit(1707818783.637:1872): pid=11142 uid=0 auid=500 ses=75 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:06:23.637000 audit[11142]: USER_START pid=11142 uid=0 auid=500 ses=75 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:06:23.637000 audit[11144]: CRED_ACQ pid=11144 uid=0 auid=500 ses=75 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:06:24.178778 kernel: audit: type=1103 audit(1707818783.637:1873): pid=11144 uid=0 auid=500 ses=75 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:06:24.178853 kernel: audit: type=1106 audit(1707818783.714:1874): pid=11142 uid=0 auid=500 ses=75 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:06:23.714000 audit[11142]: USER_END pid=11142 uid=0 auid=500 ses=75 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:06:24.274284 kernel: audit: type=1104 audit(1707818783.714:1875): pid=11142 uid=0 auid=500 ses=75 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:06:23.714000 audit[11142]: CRED_DISP pid=11142 uid=0 auid=500 ses=75 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:06:23.715000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@72-139.178.70.43:22-139.178.68.195:46782 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 10:06:26.854546 env[1473]: time="2024-02-13T10:06:26.854461913Z" level=info msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\""
Feb 13 10:06:26.855445 env[1473]: time="2024-02-13T10:06:26.854708147Z" level=info msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\""
Feb 13 10:06:26.881812 env[1473]: time="2024-02-13T10:06:26.881686805Z" level=error msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\" failed" error="failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 10:06:26.882044 env[1473]: time="2024-02-13T10:06:26.881975354Z" level=error msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\" failed" error="failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 10:06:26.882125 kubelet[2593]: E0213 10:06:26.882042 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2"
Feb 13 10:06:26.882125 kubelet[2593]: E0213 10:06:26.882120 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2}
Feb 13 10:06:26.882387 kubelet[2593]: E0213 10:06:26.882173 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 13 10:06:26.882387 kubelet[2593]: E0213 10:06:26.882178 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654"
Feb 13 10:06:26.882387 kubelet[2593]: E0213 10:06:26.882210 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-sv24x" podUID=fe6819ac-25fb-455a-b6b5-7432acf1219d
Feb 13 10:06:26.882387 kubelet[2593]: E0213 10:06:26.882211 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654}
Feb 13 10:06:26.882520 kubelet[2593]: E0213 10:06:26.882232 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 13 10:06:26.882520 kubelet[2593]: E0213 10:06:26.882247 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-zxn6w" podUID=ac15c9fc-cc5d-4a8f-ac09-16f6497ee733
Feb 13 10:06:28.724213 systemd[1]: Started sshd@73-139.178.70.43:22-139.178.68.195:56544.service.
Feb 13 10:06:28.722000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@73-139.178.70.43:22-139.178.68.195:56544 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 10:06:28.751157 kernel: kauditd_printk_skb: 1 callbacks suppressed
Feb 13 10:06:28.751225 kernel: audit: type=1130 audit(1707818788.722:1877): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@73-139.178.70.43:22-139.178.68.195:56544 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 10:06:28.860000 audit[11227]: USER_ACCT pid=11227 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:06:28.862246 sshd[11227]: Accepted publickey for core from 139.178.68.195 port 56544 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w
Feb 13 10:06:28.863727 sshd[11227]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Feb 13 10:06:28.866062 systemd-logind[1461]: New session 76 of user core.
Feb 13 10:06:28.866729 systemd[1]: Started session-76.scope.
Feb 13 10:06:28.943979 sshd[11227]: pam_unix(sshd:session): session closed for user core
Feb 13 10:06:28.945436 systemd[1]: sshd@73-139.178.70.43:22-139.178.68.195:56544.service: Deactivated successfully.
Feb 13 10:06:28.945861 systemd[1]: session-76.scope: Deactivated successfully.
Feb 13 10:06:28.946169 systemd-logind[1461]: Session 76 logged out. Waiting for processes to exit.
Feb 13 10:06:28.946641 systemd-logind[1461]: Removed session 76.
Feb 13 10:06:28.862000 audit[11227]: CRED_ACQ pid=11227 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:06:29.044029 kernel: audit: type=1101 audit(1707818788.860:1878): pid=11227 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:06:29.044068 kernel: audit: type=1103 audit(1707818788.862:1879): pid=11227 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:06:29.044088 kernel: audit: type=1006 audit(1707818788.862:1880): pid=11227 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=76 res=1
Feb 13 10:06:29.102595 kernel: audit: type=1300 audit(1707818788.862:1880): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffcd99201f0 a2=3 a3=0 items=0 ppid=1 pid=11227 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=76 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 10:06:28.862000 audit[11227]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffcd99201f0 a2=3 a3=0 items=0 ppid=1 pid=11227 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=76 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 10:06:29.194475 kernel: audit: type=1327 audit(1707818788.862:1880): proctitle=737368643A20636F7265205B707269765D
Feb 13 10:06:28.862000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D
Feb 13 10:06:29.224911 kernel: audit: type=1105 audit(1707818788.867:1881): pid=11227 uid=0 auid=500 ses=76 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:06:28.867000 audit[11227]: USER_START pid=11227 uid=0 auid=500 ses=76 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:06:29.319390 kernel: audit: type=1103 audit(1707818788.867:1882): pid=11229 uid=0 auid=500 ses=76 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:06:28.867000 audit[11229]: CRED_ACQ pid=11229 uid=0 auid=500 ses=76 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:06:29.408492 kernel: audit: type=1106 audit(1707818788.943:1883): pid=11227 uid=0 auid=500 ses=76 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:06:28.943000 audit[11227]: USER_END pid=11227 uid=0 auid=500 ses=76 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:06:29.503886 kernel: audit: type=1104 audit(1707818788.943:1884): pid=11227 uid=0 auid=500 ses=76 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:06:28.943000 audit[11227]: CRED_DISP pid=11227 uid=0 auid=500 ses=76 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:06:28.944000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@73-139.178.70.43:22-139.178.68.195:56544 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 10:06:30.855367 env[1473]: time="2024-02-13T10:06:30.855222659Z" level=info msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\""
Feb 13 10:06:30.883575 env[1473]: time="2024-02-13T10:06:30.883513294Z" level=error msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\" failed" error="failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 10:06:30.883739 kubelet[2593]: E0213 10:06:30.883698 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f"
Feb 13 10:06:30.883739 kubelet[2593]: E0213 10:06:30.883724 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f}
Feb 13 10:06:30.883932 kubelet[2593]: E0213 10:06:30.883743 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 13 10:06:30.883932 kubelet[2593]: E0213 10:06:30.883761 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250
Feb 13 10:06:31.854925 env[1473]: time="2024-02-13T10:06:31.854796877Z" level=info msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\""
Feb 13 10:06:31.880919 env[1473]: time="2024-02-13T10:06:31.880822199Z" level=error msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\" failed" error="failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 10:06:31.881226 kubelet[2593]: E0213 10:06:31.881096 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768"
Feb 13 10:06:31.881226 kubelet[2593]: E0213 10:06:31.881128 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768}
Feb 13 10:06:31.881226 kubelet[2593]: E0213 10:06:31.881162 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 13 10:06:31.881226 kubelet[2593]: E0213 10:06:31.881189 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-86cd8c4979-2tlsw" podUID=18384425-4aba-475c-a64f-6bfe3101b275
Feb 13 10:06:33.953561 systemd[1]: Started sshd@74-139.178.70.43:22-139.178.68.195:56550.service.
Feb 13 10:06:33.952000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@74-139.178.70.43:22-139.178.68.195:56550 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 10:06:33.980520 kernel: kauditd_printk_skb: 1 callbacks suppressed
Feb 13 10:06:33.980569 kernel: audit: type=1130 audit(1707818793.952:1886): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@74-139.178.70.43:22-139.178.68.195:56550 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 10:06:34.088000 audit[11311]: USER_ACCT pid=11311 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:06:34.089650 sshd[11311]: Accepted publickey for core from 139.178.68.195 port 56550 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w
Feb 13 10:06:34.090446 sshd[11311]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Feb 13 10:06:34.092355 systemd-logind[1461]: New session 77 of user core.
Feb 13 10:06:34.092984 systemd[1]: Started session-77.scope.
Feb 13 10:06:34.173612 sshd[11311]: pam_unix(sshd:session): session closed for user core
Feb 13 10:06:34.175074 systemd[1]: sshd@74-139.178.70.43:22-139.178.68.195:56550.service: Deactivated successfully.
Feb 13 10:06:34.175551 systemd[1]: session-77.scope: Deactivated successfully.
Feb 13 10:06:34.176045 systemd-logind[1461]: Session 77 logged out. Waiting for processes to exit. Feb 13 10:06:34.176684 systemd-logind[1461]: Removed session 77. Feb 13 10:06:34.088000 audit[11311]: CRED_ACQ pid=11311 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:34.273420 kernel: audit: type=1101 audit(1707818794.088:1887): pid=11311 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:34.273466 kernel: audit: type=1103 audit(1707818794.088:1888): pid=11311 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:34.273482 kernel: audit: type=1006 audit(1707818794.088:1889): pid=11311 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=77 res=1 Feb 13 10:06:34.088000 audit[11311]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffcf472670 a2=3 a3=0 items=0 ppid=1 pid=11311 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=77 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:06:34.423939 kernel: audit: type=1300 audit(1707818794.088:1889): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffcf472670 a2=3 a3=0 items=0 ppid=1 pid=11311 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=77 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:06:34.423968 kernel: audit: type=1327 audit(1707818794.088:1889): proctitle=737368643A20636F7265205B707269765D Feb 13 10:06:34.088000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 10:06:34.454377 kernel: audit: type=1105 audit(1707818794.093:1890): pid=11311 uid=0 auid=500 ses=77 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:34.093000 audit[11311]: USER_START pid=11311 uid=0 auid=500 ses=77 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:34.548751 kernel: audit: type=1103 audit(1707818794.094:1891): pid=11313 uid=0 auid=500 ses=77 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:34.094000 audit[11313]: CRED_ACQ pid=11313 uid=0 auid=500 ses=77 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:34.637876 kernel: audit: type=1106 audit(1707818794.172:1892): pid=11311 uid=0 
auid=500 ses=77 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:34.172000 audit[11311]: USER_END pid=11311 uid=0 auid=500 ses=77 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:34.733281 kernel: audit: type=1104 audit(1707818794.172:1893): pid=11311 uid=0 auid=500 ses=77 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:34.172000 audit[11311]: CRED_DISP pid=11311 uid=0 auid=500 ses=77 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:34.173000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@74-139.178.70.43:22-139.178.68.195:56550 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:06:37.854122 env[1473]: time="2024-02-13T10:06:37.854029854Z" level=info msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\"" Feb 13 10:06:37.854122 env[1473]: time="2024-02-13T10:06:37.854073223Z" level=info msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\"" Feb 13 10:06:37.884054 env[1473]: time="2024-02-13T10:06:37.883984463Z" level=error msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\" failed" error="failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:06:37.884216 env[1473]: time="2024-02-13T10:06:37.884063186Z" level=error msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\" failed" error="failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:06:37.884247 kubelet[2593]: E0213 10:06:37.884206 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654" Feb 13 10:06:37.884247 kubelet[2593]: E0213 10:06:37.884237 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654} Feb 13 
10:06:37.884475 kubelet[2593]: E0213 10:06:37.884258 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:06:37.884475 kubelet[2593]: E0213 10:06:37.884276 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-zxn6w" podUID=ac15c9fc-cc5d-4a8f-ac09-16f6497ee733 Feb 13 10:06:37.884475 kubelet[2593]: E0213 10:06:37.884207 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2" Feb 13 10:06:37.884475 kubelet[2593]: E0213 10:06:37.884293 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2} Feb 13 10:06:37.884593 kubelet[2593]: E0213 10:06:37.884314 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:06:37.884593 kubelet[2593]: E0213 10:06:37.884328 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-sv24x" podUID=fe6819ac-25fb-455a-b6b5-7432acf1219d Feb 13 10:06:39.183075 systemd[1]: Started sshd@75-139.178.70.43:22-139.178.68.195:36486.service. Feb 13 10:06:39.181000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@75-139.178.70.43:22-139.178.68.195:36486 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 10:06:39.210155 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 10:06:39.210205 kernel: audit: type=1130 audit(1707818799.181:1895): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@75-139.178.70.43:22-139.178.68.195:36486 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:06:39.319000 audit[11391]: USER_ACCT pid=11391 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:39.320645 sshd[11391]: Accepted publickey for core from 139.178.68.195 port 36486 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 10:06:39.321746 sshd[11391]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 10:06:39.323963 systemd-logind[1461]: New session 78 of user core. Feb 13 10:06:39.324603 systemd[1]: Started session-78.scope. Feb 13 10:06:39.405065 sshd[11391]: pam_unix(sshd:session): session closed for user core Feb 13 10:06:39.406566 systemd[1]: sshd@75-139.178.70.43:22-139.178.68.195:36486.service: Deactivated successfully. Feb 13 10:06:39.407023 systemd[1]: session-78.scope: Deactivated successfully. Feb 13 10:06:39.407311 systemd-logind[1461]: Session 78 logged out. Waiting for processes to exit. Feb 13 10:06:39.407908 systemd-logind[1461]: Removed session 78. Feb 13 10:06:39.320000 audit[11391]: CRED_ACQ pid=11391 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:39.503499 kernel: audit: type=1101 audit(1707818799.319:1896): pid=11391 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:39.503543 kernel: audit: type=1103 audit(1707818799.320:1897): pid=11391 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:39.503561 kernel: audit: type=1006 audit(1707818799.320:1898): pid=11391 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=78 res=1 Feb 13 10:06:39.562068 kernel: audit: type=1300 audit(1707818799.320:1898): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffefd427760 a2=3 a3=0 items=0 ppid=1 pid=11391 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=78 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:06:39.320000 audit[11391]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffefd427760 a2=3 a3=0 items=0 ppid=1 pid=11391 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=78 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:06:39.320000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 10:06:39.684373 kernel: audit: type=1327 audit(1707818799.320:1898): proctitle=737368643A20636F7265205B707269765D Feb 13 
10:06:39.684404 kernel: audit: type=1105 audit(1707818799.325:1899): pid=11391 uid=0 auid=500 ses=78 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:39.325000 audit[11391]: USER_START pid=11391 uid=0 auid=500 ses=78 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:39.778734 kernel: audit: type=1103 audit(1707818799.326:1900): pid=11393 uid=0 auid=500 ses=78 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:39.326000 audit[11393]: CRED_ACQ pid=11393 uid=0 auid=500 ses=78 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:39.867833 kernel: audit: type=1106 audit(1707818799.404:1901): pid=11391 uid=0 auid=500 ses=78 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:39.404000 audit[11391]: USER_END pid=11391 uid=0 auid=500 ses=78 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:39.963285 kernel: audit: type=1104 audit(1707818799.404:1902): pid=11391 uid=0 auid=500 ses=78 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:39.404000 audit[11391]: CRED_DISP pid=11391 uid=0 auid=500 ses=78 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:39.405000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@75-139.178.70.43:22-139.178.68.195:36486 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 10:06:42.855316 env[1473]: time="2024-02-13T10:06:42.855230115Z" level=info msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\"" Feb 13 10:06:42.881829 env[1473]: time="2024-02-13T10:06:42.881759338Z" level=error msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\" failed" error="failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:06:42.881985 kubelet[2593]: E0213 10:06:42.881941 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f" Feb 13 10:06:42.881985 kubelet[2593]: E0213 10:06:42.881968 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f} Feb 13 10:06:42.882173 kubelet[2593]: E0213 10:06:42.881991 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:06:42.882173 kubelet[2593]: E0213 10:06:42.882009 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250 Feb 13 10:06:43.854520 env[1473]: time="2024-02-13T10:06:43.854391456Z" level=info msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\"" Feb 13 10:06:43.875360 env[1473]: time="2024-02-13T10:06:43.875286042Z" level=error msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\" failed" error="failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:06:43.875627 kubelet[2593]: E0213 10:06:43.875471 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox 
\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768" Feb 13 10:06:43.875627 kubelet[2593]: E0213 10:06:43.875505 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768} Feb 13 10:06:43.875627 kubelet[2593]: E0213 10:06:43.875542 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:06:43.875627 kubelet[2593]: E0213 10:06:43.875569 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-86cd8c4979-2tlsw" podUID=18384425-4aba-475c-a64f-6bfe3101b275 Feb 13 10:06:44.412105 systemd[1]: Started sshd@76-139.178.70.43:22-139.178.68.195:36488.service. Feb 13 10:06:44.411000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@76-139.178.70.43:22-139.178.68.195:36488 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:06:44.439382 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 10:06:44.439452 kernel: audit: type=1130 audit(1707818804.411:1904): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@76-139.178.70.43:22-139.178.68.195:36488 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:06:44.549836 sshd[11474]: Accepted publickey for core from 139.178.68.195 port 36488 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 10:06:44.548000 audit[11474]: USER_ACCT pid=11474 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:44.551619 sshd[11474]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 10:06:44.553889 systemd-logind[1461]: New session 79 of user core. Feb 13 10:06:44.554448 systemd[1]: Started session-79.scope. Feb 13 10:06:44.632430 sshd[11474]: pam_unix(sshd:session): session closed for user core Feb 13 10:06:44.633942 systemd[1]: sshd@76-139.178.70.43:22-139.178.68.195:36488.service: Deactivated successfully. Feb 13 10:06:44.634376 systemd[1]: session-79.scope: Deactivated successfully. 
Feb 13 10:06:44.634763 systemd-logind[1461]: Session 79 logged out. Waiting for processes to exit. Feb 13 10:06:44.635618 systemd-logind[1461]: Removed session 79. Feb 13 10:06:44.550000 audit[11474]: CRED_ACQ pid=11474 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:44.731950 kernel: audit: type=1101 audit(1707818804.548:1905): pid=11474 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:44.731990 kernel: audit: type=1103 audit(1707818804.550:1906): pid=11474 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:44.732010 kernel: audit: type=1006 audit(1707818804.550:1907): pid=11474 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=79 res=1 Feb 13 10:06:44.790545 kernel: audit: type=1300 audit(1707818804.550:1907): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd88fb05b0 a2=3 a3=0 items=0 ppid=1 pid=11474 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=79 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:06:44.550000 audit[11474]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd88fb05b0 a2=3 a3=0 items=0 ppid=1 pid=11474 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=79 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:06:44.882418 kernel: audit: type=1327 audit(1707818804.550:1907): proctitle=737368643A20636F7265205B707269765D Feb 13 10:06:44.550000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 10:06:44.912838 kernel: audit: type=1105 audit(1707818804.555:1908): pid=11474 uid=0 auid=500 ses=79 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:44.555000 audit[11474]: USER_START pid=11474 uid=0 auid=500 ses=79 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:45.007210 kernel: audit: type=1103 audit(1707818804.555:1909): pid=11476 uid=0 auid=500 ses=79 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:44.555000 audit[11476]: CRED_ACQ pid=11476 uid=0 auid=500 ses=79 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:45.096313 kernel: audit: type=1106 audit(1707818804.631:1910): pid=11474 uid=0 
auid=500 ses=79 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:44.631000 audit[11474]: USER_END pid=11474 uid=0 auid=500 ses=79 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:45.191762 kernel: audit: type=1104 audit(1707818804.631:1911): pid=11474 uid=0 auid=500 ses=79 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:44.631000 audit[11474]: CRED_DISP pid=11474 uid=0 auid=500 ses=79 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:44.632000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@76-139.178.70.43:22-139.178.68.195:36488 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:06:49.642129 systemd[1]: Started sshd@77-139.178.70.43:22-139.178.68.195:48188.service. Feb 13 10:06:49.640000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@77-139.178.70.43:22-139.178.68.195:48188 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:06:49.669196 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 10:06:49.669262 kernel: audit: type=1130 audit(1707818809.640:1913): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@77-139.178.70.43:22-139.178.68.195:48188 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:06:49.777000 audit[11499]: USER_ACCT pid=11499 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:49.779169 sshd[11499]: Accepted publickey for core from 139.178.68.195 port 48188 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 10:06:49.780623 sshd[11499]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 10:06:49.783006 systemd-logind[1461]: New session 80 of user core. Feb 13 10:06:49.783553 systemd[1]: Started session-80.scope. Feb 13 10:06:49.862853 sshd[11499]: pam_unix(sshd:session): session closed for user core Feb 13 10:06:49.864301 systemd[1]: sshd@77-139.178.70.43:22-139.178.68.195:48188.service: Deactivated successfully. Feb 13 10:06:49.864767 systemd[1]: session-80.scope: Deactivated successfully. Feb 13 10:06:49.865143 systemd-logind[1461]: Session 80 logged out. Waiting for processes to exit. Feb 13 10:06:49.866031 systemd-logind[1461]: Removed session 80. 
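[Annotation] The audit records around these logins follow a fixed lifecycle per ses= id: USER_ACCT and CRED_ACQ on accept, USER_START when the PAM session opens, then USER_END and CRED_DISP when it closes, bracketed by systemd SERVICE_START/SERVICE_STOP for the per-connection unit. Sessions 78 through 80 each open and close within roughly a tenth of a second, consistent with short-lived automated logins. A small sketch for pairing the open/close records from a journal exported as plain text; the regex, helper name, and "journal.txt" are illustrative, not part of any tool shown here.

import re
from collections import defaultdict

REC = re.compile(r"audit\[\d+\]: (USER_START|USER_END) .*?ses=(\d+)")

def session_pairs(lines):
    # Group USER_START/USER_END audit records by their ses= identifier.
    sessions = defaultdict(dict)
    for line in lines:
        m = REC.search(line)
        if m:
            kind, ses = m.group(1), int(m.group(2))
            sessions[ses][kind] = line
    return sessions

# Usage sketch: session_pairs(open("journal.txt")) should map ses=79
# and ses=80 to exactly one USER_START and one USER_END each.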
Feb 13 10:06:49.779000 audit[11499]: CRED_ACQ pid=11499 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:49.962721 kernel: audit: type=1101 audit(1707818809.777:1914): pid=11499 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:49.962761 kernel: audit: type=1103 audit(1707818809.779:1915): pid=11499 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:49.962780 kernel: audit: type=1006 audit(1707818809.779:1916): pid=11499 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=80 res=1 Feb 13 10:06:50.021350 kernel: audit: type=1300 audit(1707818809.779:1916): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff32052440 a2=3 a3=0 items=0 ppid=1 pid=11499 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=80 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:06:49.779000 audit[11499]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff32052440 a2=3 a3=0 items=0 ppid=1 pid=11499 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=80 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:06:50.113424 kernel: audit: type=1327 audit(1707818809.779:1916): proctitle=737368643A20636F7265205B707269765D Feb 13 10:06:49.779000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 10:06:49.784000 audit[11499]: USER_START pid=11499 uid=0 auid=500 ses=80 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:50.238314 kernel: audit: type=1105 audit(1707818809.784:1917): pid=11499 uid=0 auid=500 ses=80 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:50.238350 kernel: audit: type=1103 audit(1707818809.785:1918): pid=11501 uid=0 auid=500 ses=80 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:49.785000 audit[11501]: CRED_ACQ pid=11501 uid=0 auid=500 ses=80 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:49.862000 audit[11499]: USER_END pid=11499 uid=0 auid=500 ses=80 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" 
exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:50.423067 kernel: audit: type=1106 audit(1707818809.862:1919): pid=11499 uid=0 auid=500 ses=80 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:50.423106 kernel: audit: type=1104 audit(1707818809.862:1920): pid=11499 uid=0 auid=500 ses=80 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:49.862000 audit[11499]: CRED_DISP pid=11499 uid=0 auid=500 ses=80 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:49.863000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@77-139.178.70.43:22-139.178.68.195:48188 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:06:50.855276 env[1473]: time="2024-02-13T10:06:50.855034185Z" level=info msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\"" Feb 13 10:06:50.906755 env[1473]: time="2024-02-13T10:06:50.906650270Z" level=error msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\" failed" error="failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:06:50.906960 kubelet[2593]: E0213 10:06:50.906928 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2" Feb 13 10:06:50.907365 kubelet[2593]: E0213 10:06:50.906977 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2} Feb 13 10:06:50.907365 kubelet[2593]: E0213 10:06:50.907026 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:06:50.907365 kubelet[2593]: E0213 10:06:50.907067 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to 
destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-sv24x" podUID=fe6819ac-25fb-455a-b6b5-7432acf1219d Feb 13 10:06:52.855374 env[1473]: time="2024-02-13T10:06:52.855236563Z" level=info msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\"" Feb 13 10:06:52.910300 env[1473]: time="2024-02-13T10:06:52.910247184Z" level=error msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\" failed" error="failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:06:52.910551 kubelet[2593]: E0213 10:06:52.910502 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654" Feb 13 10:06:52.910551 kubelet[2593]: E0213 10:06:52.910544 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654} Feb 13 10:06:52.910931 kubelet[2593]: E0213 10:06:52.910585 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:06:52.910931 kubelet[2593]: E0213 10:06:52.910619 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-zxn6w" podUID=ac15c9fc-cc5d-4a8f-ac09-16f6497ee733 Feb 13 10:06:54.854683 env[1473]: time="2024-02-13T10:06:54.854589923Z" level=info msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\"" Feb 13 10:06:54.854683 env[1473]: time="2024-02-13T10:06:54.854589917Z" level=info msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\"" Feb 13 10:06:54.875011 systemd[1]: Started sshd@78-139.178.70.43:22-139.178.68.195:48194.service. 
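[Annotation] The unit name sshd@78-139.178.70.43:22-139.178.68.195:48194.service is a socket-activated per-connection instance: the instance string encodes a connection counter, the local address:port, and the remote address:port. A small helper to split it when correlating these units with the "Accepted publickey" lines; the field names are my own, not systemd's.

import re

UNIT = re.compile(
    r"sshd@(?P<seq>\d+)-(?P<laddr>[\d.]+):(?P<lport>\d+)"
    r"-(?P<raddr>[\d.]+):(?P<rport>\d+)"
)

def parse_sshd_unit(name: str):
    # e.g. "sshd@78-139.178.70.43:22-139.178.68.195:48194.service"
    m = UNIT.search(name)
    return m.groupdict() if m else None

In this log every connection comes from 139.178.68.195 to port 22 on 139.178.70.43, matching the sshd Accepted lines.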
Feb 13 10:06:54.874000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@78-139.178.70.43:22-139.178.68.195:48194 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:06:54.899527 env[1473]: time="2024-02-13T10:06:54.899469082Z" level=error msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\" failed" error="failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:06:54.899730 kubelet[2593]: E0213 10:06:54.899706 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f" Feb 13 10:06:54.900205 kubelet[2593]: E0213 10:06:54.899761 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f} Feb 13 10:06:54.900205 kubelet[2593]: E0213 10:06:54.899833 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:06:54.900205 kubelet[2593]: E0213 10:06:54.899882 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250 Feb 13 10:06:54.901959 env[1473]: time="2024-02-13T10:06:54.901897174Z" level=error msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\" failed" error="failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:06:54.902054 kubelet[2593]: E0213 10:06:54.902043 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768" Feb 13 10:06:54.902115 kubelet[2593]: E0213 10:06:54.902067 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768} Feb 13 10:06:54.902115 kubelet[2593]: E0213 10:06:54.902100 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:06:54.902241 kubelet[2593]: E0213 10:06:54.902125 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-86cd8c4979-2tlsw" podUID=18384425-4aba-475c-a64f-6bfe3101b275 Feb 13 10:06:54.907101 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 10:06:54.907167 kernel: audit: type=1130 audit(1707818814.874:1922): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@78-139.178.70.43:22-139.178.68.195:48194 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:06:55.014000 audit[11627]: USER_ACCT pid=11627 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:55.015888 sshd[11627]: Accepted publickey for core from 139.178.68.195 port 48194 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 10:06:55.018994 sshd[11627]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 10:06:55.021519 systemd-logind[1461]: New session 81 of user core. Feb 13 10:06:55.021986 systemd[1]: Started session-81.scope. Feb 13 10:06:55.098955 sshd[11627]: pam_unix(sshd:session): session closed for user core Feb 13 10:06:55.100399 systemd[1]: sshd@78-139.178.70.43:22-139.178.68.195:48194.service: Deactivated successfully. Feb 13 10:06:55.100816 systemd[1]: session-81.scope: Deactivated successfully. Feb 13 10:06:55.101133 systemd-logind[1461]: Session 81 logged out. Waiting for processes to exit. Feb 13 10:06:55.101646 systemd-logind[1461]: Removed session 81. 
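[Annotation] The PROCTITLE values in these audit records are the process argv, hex-encoded with NUL argument separators; 737368643A20636F7265205B707269765D above decodes to "sshd: core [priv]". The kernel caps the recorded value at 128 bytes, which is why the longer kube-controller-manager and kube-apiserver command lines further down cut off mid-flag. A one-function decoder:

def decode_proctitle(hexstr: str) -> list[str]:
    # Audit PROCTITLE records argv as hex with NUL separators between arguments.
    return bytes.fromhex(hexstr).decode("utf-8", "replace").split("\x00")

# decode_proctitle("737368643A20636F7265205B707269765D")
#   -> ['sshd: core [priv]']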
Feb 13 10:06:55.014000 audit[11627]: CRED_ACQ pid=11627 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:55.197771 kernel: audit: type=1101 audit(1707818815.014:1923): pid=11627 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:55.197869 kernel: audit: type=1103 audit(1707818815.014:1924): pid=11627 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:55.197891 kernel: audit: type=1006 audit(1707818815.014:1925): pid=11627 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=81 res=1 Feb 13 10:06:55.256432 kernel: audit: type=1300 audit(1707818815.014:1925): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffcad0c8c30 a2=3 a3=0 items=0 ppid=1 pid=11627 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=81 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:06:55.014000 audit[11627]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffcad0c8c30 a2=3 a3=0 items=0 ppid=1 pid=11627 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=81 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:06:55.014000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 10:06:55.378918 kernel: audit: type=1327 audit(1707818815.014:1925): proctitle=737368643A20636F7265205B707269765D Feb 13 10:06:55.379012 kernel: audit: type=1105 audit(1707818815.021:1926): pid=11627 uid=0 auid=500 ses=81 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:55.021000 audit[11627]: USER_START pid=11627 uid=0 auid=500 ses=81 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:55.021000 audit[11649]: CRED_ACQ pid=11649 uid=0 auid=500 ses=81 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:55.562747 kernel: audit: type=1103 audit(1707818815.021:1927): pid=11649 uid=0 auid=500 ses=81 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:55.562797 kernel: audit: type=1106 audit(1707818815.098:1928): pid=11627 uid=0 auid=500 ses=81 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close 
grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:55.098000 audit[11627]: USER_END pid=11627 uid=0 auid=500 ses=81 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:55.658434 kernel: audit: type=1104 audit(1707818815.098:1929): pid=11627 uid=0 auid=500 ses=81 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:55.098000 audit[11627]: CRED_DISP pid=11627 uid=0 auid=500 ses=81 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:06:55.098000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@78-139.178.70.43:22-139.178.68.195:48194 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:06:55.211000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:06:55.211000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c0010a1dd0 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 10:06:55.211000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 10:06:55.211000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:06:55.211000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c000dfc8c0 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 10:06:55.211000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 10:06:55.438000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 
tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:06:55.438000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=67 a1=c00ef73120 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 10:06:55.438000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 10:06:55.438000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:06:55.438000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=67 a1=c01009c5d0 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 10:06:55.438000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 10:06:55.438000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:06:55.438000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=67 a1=c011c70fc0 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 10:06:55.438000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 10:06:55.443000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-client.crt" dev="sdb9" ino=525079 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:06:55.443000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=67 a1=c011c71050 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 10:06:55.443000 audit: PROCTITLE 
proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 10:06:55.443000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/apiserver.crt" dev="sdb9" ino=525073 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:06:55.443000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=68 a1=c011c71080 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 10:06:55.443000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 10:06:55.443000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:06:55.443000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=68 a1=c00ef73140 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 10:06:55.443000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 10:07:00.108533 systemd[1]: Started sshd@79-139.178.70.43:22-139.178.68.195:58172.service. Feb 13 10:07:00.107000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@79-139.178.70.43:22-139.178.68.195:58172 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:07:00.135883 kernel: kauditd_printk_skb: 25 callbacks suppressed Feb 13 10:07:00.135959 kernel: audit: type=1130 audit(1707818820.107:1939): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@79-139.178.70.43:22-139.178.68.195:58172 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:07:00.243000 audit[11674]: USER_ACCT pid=11674 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:00.244900 sshd[11674]: Accepted publickey for core from 139.178.68.195 port 58172 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 10:07:00.246240 sshd[11674]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 10:07:00.248544 systemd-logind[1461]: New session 82 of user core. 
Feb 13 10:07:00.249029 systemd[1]: Started session-82.scope. Feb 13 10:07:00.326060 sshd[11674]: pam_unix(sshd:session): session closed for user core Feb 13 10:07:00.327440 systemd[1]: sshd@79-139.178.70.43:22-139.178.68.195:58172.service: Deactivated successfully. Feb 13 10:07:00.327854 systemd[1]: session-82.scope: Deactivated successfully. Feb 13 10:07:00.328217 systemd-logind[1461]: Session 82 logged out. Waiting for processes to exit. Feb 13 10:07:00.328838 systemd-logind[1461]: Removed session 82. Feb 13 10:07:00.244000 audit[11674]: CRED_ACQ pid=11674 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:00.426822 kernel: audit: type=1101 audit(1707818820.243:1940): pid=11674 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:00.426871 kernel: audit: type=1103 audit(1707818820.244:1941): pid=11674 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:00.426889 kernel: audit: type=1006 audit(1707818820.244:1942): pid=11674 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=82 res=1 Feb 13 10:07:00.485410 kernel: audit: type=1300 audit(1707818820.244:1942): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffcaa842040 a2=3 a3=0 items=0 ppid=1 pid=11674 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=82 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:07:00.244000 audit[11674]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffcaa842040 a2=3 a3=0 items=0 ppid=1 pid=11674 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=82 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:07:00.577421 kernel: audit: type=1327 audit(1707818820.244:1942): proctitle=737368643A20636F7265205B707269765D Feb 13 10:07:00.244000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 10:07:00.607835 kernel: audit: type=1105 audit(1707818820.249:1943): pid=11674 uid=0 auid=500 ses=82 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:00.249000 audit[11674]: USER_START pid=11674 uid=0 auid=500 ses=82 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:00.702244 kernel: audit: type=1103 audit(1707818820.250:1944): pid=11676 uid=0 auid=500 ses=82 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:00.250000 
audit[11676]: CRED_ACQ pid=11676 uid=0 auid=500 ses=82 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:00.791425 kernel: audit: type=1106 audit(1707818820.325:1945): pid=11674 uid=0 auid=500 ses=82 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:00.325000 audit[11674]: USER_END pid=11674 uid=0 auid=500 ses=82 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:00.886889 kernel: audit: type=1104 audit(1707818820.325:1946): pid=11674 uid=0 auid=500 ses=82 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:00.325000 audit[11674]: CRED_DISP pid=11674 uid=0 auid=500 ses=82 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:00.326000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@79-139.178.70.43:22-139.178.68.195:58172 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 10:07:02.855184 env[1473]: time="2024-02-13T10:07:02.855094345Z" level=info msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\"" Feb 13 10:07:02.870470 env[1473]: time="2024-02-13T10:07:02.870404486Z" level=error msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\" failed" error="failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:07:02.870629 kubelet[2593]: E0213 10:07:02.870580 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2" Feb 13 10:07:02.870629 kubelet[2593]: E0213 10:07:02.870612 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2} Feb 13 10:07:02.870864 kubelet[2593]: E0213 10:07:02.870637 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:07:02.870864 kubelet[2593]: E0213 10:07:02.870672 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-sv24x" podUID=fe6819ac-25fb-455a-b6b5-7432acf1219d Feb 13 10:07:03.854194 env[1473]: time="2024-02-13T10:07:03.854070875Z" level=info msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\"" Feb 13 10:07:03.880169 env[1473]: time="2024-02-13T10:07:03.880134095Z" level=error msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\" failed" error="failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:07:03.880453 kubelet[2593]: E0213 10:07:03.880332 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox 
\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654" Feb 13 10:07:03.880453 kubelet[2593]: E0213 10:07:03.880382 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654} Feb 13 10:07:03.880453 kubelet[2593]: E0213 10:07:03.880406 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:07:03.880453 kubelet[2593]: E0213 10:07:03.880425 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-zxn6w" podUID=ac15c9fc-cc5d-4a8f-ac09-16f6497ee733 Feb 13 10:07:05.336186 systemd[1]: Started sshd@80-139.178.70.43:22-139.178.68.195:58176.service. Feb 13 10:07:05.334000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@80-139.178.70.43:22-139.178.68.195:58176 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:07:05.363256 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 10:07:05.363296 kernel: audit: type=1130 audit(1707818825.334:1948): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@80-139.178.70.43:22-139.178.68.195:58176 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:07:05.472000 audit[11756]: USER_ACCT pid=11756 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:05.474410 sshd[11756]: Accepted publickey for core from 139.178.68.195 port 58176 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 10:07:05.475584 sshd[11756]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 10:07:05.478253 systemd-logind[1461]: New session 83 of user core. Feb 13 10:07:05.479289 systemd[1]: Started session-83.scope. 
Feb 13 10:07:05.474000 audit[11756]: CRED_ACQ pid=11756 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:05.656423 kernel: audit: type=1101 audit(1707818825.472:1949): pid=11756 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:05.656462 kernel: audit: type=1103 audit(1707818825.474:1950): pid=11756 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:05.656476 kernel: audit: type=1006 audit(1707818825.474:1951): pid=11756 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=83 res=1 Feb 13 10:07:05.715067 kernel: audit: type=1300 audit(1707818825.474:1951): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffdc2584b0 a2=3 a3=0 items=0 ppid=1 pid=11756 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=83 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:07:05.474000 audit[11756]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffdc2584b0 a2=3 a3=0 items=0 ppid=1 pid=11756 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=83 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:07:05.474000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 10:07:05.807247 sshd[11756]: pam_unix(sshd:session): session closed for user core Feb 13 10:07:05.808913 systemd[1]: sshd@80-139.178.70.43:22-139.178.68.195:58176.service: Deactivated successfully. Feb 13 10:07:05.809550 systemd[1]: session-83.scope: Deactivated successfully. Feb 13 10:07:05.810015 systemd-logind[1461]: Session 83 logged out. Waiting for processes to exit. Feb 13 10:07:05.810503 systemd-logind[1461]: Removed session 83. 
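[Annotation] The AVC records from pid 2416 (kube-controller) and pid 2425 (kube-apiserver) all deny the watch permission on certificates under /etc/kubernetes/pki, and the paired SYSCALL records show arch=c000003e (x86_64) syscall=254, which is inotify_add_watch, failing with exit=-13 (EACCES) while permissive=0, i.e. SELinux is enforcing. The control-plane binaries appear to be trying to watch their CA and serving certs, presumably for rotation, and the policy refuses the watch. A tiny helper for reading the exit= field:

import errno

def audit_exit(exit_val: int) -> str:
    # Negative exit= values in audit SYSCALL records are -errno.
    return errno.errorcode.get(-exit_val, "unknown") if exit_val < 0 else "success"

# audit_exit(-13) -> 'EACCES', matching the denials above.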
Feb 13 10:07:05.837528 kernel: audit: type=1327 audit(1707818825.474:1951): proctitle=737368643A20636F7265205B707269765D Feb 13 10:07:05.837561 kernel: audit: type=1105 audit(1707818825.480:1952): pid=11756 uid=0 auid=500 ses=83 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:05.480000 audit[11756]: USER_START pid=11756 uid=0 auid=500 ses=83 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:05.931977 kernel: audit: type=1103 audit(1707818825.480:1953): pid=11758 uid=0 auid=500 ses=83 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:05.480000 audit[11758]: CRED_ACQ pid=11758 uid=0 auid=500 ses=83 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:06.021146 kernel: audit: type=1106 audit(1707818825.806:1954): pid=11756 uid=0 auid=500 ses=83 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:05.806000 audit[11756]: USER_END pid=11756 uid=0 auid=500 ses=83 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:06.116617 kernel: audit: type=1104 audit(1707818825.806:1955): pid=11756 uid=0 auid=500 ses=83 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:05.806000 audit[11756]: CRED_DISP pid=11756 uid=0 auid=500 ses=83 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:05.807000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@80-139.178.70.43:22-139.178.68.195:58176 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 10:07:06.855285 env[1473]: time="2024-02-13T10:07:06.855160584Z" level=info msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\"" Feb 13 10:07:06.880370 env[1473]: time="2024-02-13T10:07:06.880303952Z" level=error msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\" failed" error="failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:07:06.880562 kubelet[2593]: E0213 10:07:06.880518 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f" Feb 13 10:07:06.880562 kubelet[2593]: E0213 10:07:06.880545 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f} Feb 13 10:07:06.880748 kubelet[2593]: E0213 10:07:06.880567 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:07:06.880748 kubelet[2593]: E0213 10:07:06.880584 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250 Feb 13 10:07:08.854667 env[1473]: time="2024-02-13T10:07:08.854541133Z" level=info msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\"" Feb 13 10:07:08.879869 env[1473]: time="2024-02-13T10:07:08.879804004Z" level=error msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\" failed" error="failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:07:08.880037 kubelet[2593]: E0213 10:07:08.879992 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox 
\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768" Feb 13 10:07:08.880037 kubelet[2593]: E0213 10:07:08.880018 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768} Feb 13 10:07:08.880037 kubelet[2593]: E0213 10:07:08.880040 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:07:08.880262 kubelet[2593]: E0213 10:07:08.880058 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-86cd8c4979-2tlsw" podUID=18384425-4aba-475c-a64f-6bfe3101b275 Feb 13 10:07:09.618000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:07:09.618000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c002dcaa60 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 10:07:09.618000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 10:07:09.618000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:07:09.618000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=d a1=c002da7d20 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 10:07:09.618000 audit: PROCTITLE 
proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 10:07:09.620000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:07:09.620000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c000dfd680 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 10:07:09.620000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 10:07:09.620000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:07:09.620000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=c a1=c000dfd6a0 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 10:07:09.620000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 10:07:10.759149 systemd[1]: Started sshd@81-139.178.70.43:22-139.178.68.195:57716.service. Feb 13 10:07:10.757000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@81-139.178.70.43:22-139.178.68.195:57716 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:07:10.786224 kernel: kauditd_printk_skb: 13 callbacks suppressed Feb 13 10:07:10.786281 kernel: audit: type=1130 audit(1707818830.757:1961): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@81-139.178.70.43:22-139.178.68.195:57716 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 10:07:10.895000 audit[11836]: USER_ACCT pid=11836 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:10.896624 sshd[11836]: Accepted publickey for core from 139.178.68.195 port 57716 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 10:07:10.897717 sshd[11836]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 10:07:10.900116 systemd-logind[1461]: New session 84 of user core. Feb 13 10:07:10.900699 systemd[1]: Started session-84.scope. Feb 13 10:07:10.980690 sshd[11836]: pam_unix(sshd:session): session closed for user core Feb 13 10:07:10.982117 systemd[1]: sshd@81-139.178.70.43:22-139.178.68.195:57716.service: Deactivated successfully. Feb 13 10:07:10.982605 systemd[1]: session-84.scope: Deactivated successfully. Feb 13 10:07:10.983109 systemd-logind[1461]: Session 84 logged out. Waiting for processes to exit. Feb 13 10:07:10.983685 systemd-logind[1461]: Removed session 84. Feb 13 10:07:10.896000 audit[11836]: CRED_ACQ pid=11836 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:11.079902 kernel: audit: type=1101 audit(1707818830.895:1962): pid=11836 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:11.079946 kernel: audit: type=1103 audit(1707818830.896:1963): pid=11836 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:11.079970 kernel: audit: type=1006 audit(1707818830.896:1964): pid=11836 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=84 res=1 Feb 13 10:07:11.138529 kernel: audit: type=1300 audit(1707818830.896:1964): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe68cf74d0 a2=3 a3=0 items=0 ppid=1 pid=11836 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=84 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:07:10.896000 audit[11836]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe68cf74d0 a2=3 a3=0 items=0 ppid=1 pid=11836 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=84 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:07:11.230590 kernel: audit: type=1327 audit(1707818830.896:1964): proctitle=737368643A20636F7265205B707269765D Feb 13 10:07:10.896000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 10:07:11.261031 kernel: audit: type=1105 audit(1707818830.901:1965): pid=11836 uid=0 auid=500 ses=84 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 
10:07:10.901000 audit[11836]: USER_START pid=11836 uid=0 auid=500 ses=84 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:11.355417 kernel: audit: type=1103 audit(1707818830.902:1966): pid=11838 uid=0 auid=500 ses=84 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:10.902000 audit[11838]: CRED_ACQ pid=11838 uid=0 auid=500 ses=84 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:11.444544 kernel: audit: type=1106 audit(1707818830.980:1967): pid=11836 uid=0 auid=500 ses=84 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:10.980000 audit[11836]: USER_END pid=11836 uid=0 auid=500 ses=84 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:11.540103 kernel: audit: type=1104 audit(1707818830.980:1968): pid=11836 uid=0 auid=500 ses=84 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:10.980000 audit[11836]: CRED_DISP pid=11836 uid=0 auid=500 ses=84 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:10.980000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@81-139.178.70.43:22-139.178.68.195:57716 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:07:15.990375 systemd[1]: Started sshd@82-139.178.70.43:22-139.178.68.195:57732.service. Feb 13 10:07:15.989000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@82-139.178.70.43:22-139.178.68.195:57732 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:07:16.017287 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 10:07:16.017394 kernel: audit: type=1130 audit(1707818835.989:1970): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@82-139.178.70.43:22-139.178.68.195:57732 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 10:07:16.130000 audit[11863]: USER_ACCT pid=11863 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:16.131916 sshd[11863]: Accepted publickey for core from 139.178.68.195 port 57732 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 10:07:16.136754 sshd[11863]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 10:07:16.146311 systemd-logind[1461]: New session 85 of user core. Feb 13 10:07:16.149001 systemd[1]: Started session-85.scope. Feb 13 10:07:16.134000 audit[11863]: CRED_ACQ pid=11863 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:16.234637 sshd[11863]: pam_unix(sshd:session): session closed for user core Feb 13 10:07:16.236035 systemd[1]: sshd@82-139.178.70.43:22-139.178.68.195:57732.service: Deactivated successfully. Feb 13 10:07:16.236458 systemd[1]: session-85.scope: Deactivated successfully. Feb 13 10:07:16.236854 systemd-logind[1461]: Session 85 logged out. Waiting for processes to exit. Feb 13 10:07:16.237239 systemd-logind[1461]: Removed session 85. Feb 13 10:07:16.313522 kernel: audit: type=1101 audit(1707818836.130:1971): pid=11863 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:16.313562 kernel: audit: type=1103 audit(1707818836.134:1972): pid=11863 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:16.313589 kernel: audit: type=1006 audit(1707818836.134:1973): pid=11863 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=85 res=1 Feb 13 10:07:16.134000 audit[11863]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff8c80e490 a2=3 a3=0 items=0 ppid=1 pid=11863 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=85 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:07:16.463990 kernel: audit: type=1300 audit(1707818836.134:1973): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff8c80e490 a2=3 a3=0 items=0 ppid=1 pid=11863 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=85 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:07:16.464030 kernel: audit: type=1327 audit(1707818836.134:1973): proctitle=737368643A20636F7265205B707269765D Feb 13 10:07:16.134000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 10:07:16.494417 kernel: audit: type=1105 audit(1707818836.156:1974): pid=11863 uid=0 auid=500 ses=85 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 
10:07:16.156000 audit[11863]: USER_START pid=11863 uid=0 auid=500 ses=85 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:16.588921 kernel: audit: type=1103 audit(1707818836.158:1975): pid=11865 uid=0 auid=500 ses=85 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:16.158000 audit[11865]: CRED_ACQ pid=11865 uid=0 auid=500 ses=85 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:16.678089 kernel: audit: type=1106 audit(1707818836.233:1976): pid=11863 uid=0 auid=500 ses=85 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:16.233000 audit[11863]: USER_END pid=11863 uid=0 auid=500 ses=85 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:16.234000 audit[11863]: CRED_DISP pid=11863 uid=0 auid=500 ses=85 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:16.774420 kernel: audit: type=1104 audit(1707818836.234:1977): pid=11863 uid=0 auid=500 ses=85 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:16.853701 env[1473]: time="2024-02-13T10:07:16.853677322Z" level=info msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\"" Feb 13 10:07:16.234000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@82-139.178.70.43:22-139.178.68.195:57732 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 10:07:16.865523 env[1473]: time="2024-02-13T10:07:16.865453160Z" level=error msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\" failed" error="failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:07:16.865644 kubelet[2593]: E0213 10:07:16.865628 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2" Feb 13 10:07:16.865814 kubelet[2593]: E0213 10:07:16.865655 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2} Feb 13 10:07:16.865814 kubelet[2593]: E0213 10:07:16.865678 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:07:16.865814 kubelet[2593]: E0213 10:07:16.865697 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-sv24x" podUID=fe6819ac-25fb-455a-b6b5-7432acf1219d Feb 13 10:07:17.854402 env[1473]: time="2024-02-13T10:07:17.854278044Z" level=info msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\"" Feb 13 10:07:17.854402 env[1473]: time="2024-02-13T10:07:17.854278037Z" level=info msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\"" Feb 13 10:07:17.880567 env[1473]: time="2024-02-13T10:07:17.880510645Z" level=error msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\" failed" error="failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:07:17.880567 env[1473]: time="2024-02-13T10:07:17.880559507Z" level=error msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\" failed" error="failed to destroy network for sandbox 
\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:07:17.880727 kubelet[2593]: E0213 10:07:17.880695 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f" Feb 13 10:07:17.880900 kubelet[2593]: E0213 10:07:17.880727 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f} Feb 13 10:07:17.880900 kubelet[2593]: E0213 10:07:17.880758 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:07:17.880900 kubelet[2593]: E0213 10:07:17.880694 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654" Feb 13 10:07:17.880900 kubelet[2593]: E0213 10:07:17.880782 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654} Feb 13 10:07:17.880900 kubelet[2593]: E0213 10:07:17.880785 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250 Feb 13 10:07:17.881036 kubelet[2593]: E0213 10:07:17.880802 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:07:17.881036 
kubelet[2593]: E0213 10:07:17.880816 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-zxn6w" podUID=ac15c9fc-cc5d-4a8f-ac09-16f6497ee733 Feb 13 10:07:20.854783 env[1473]: time="2024-02-13T10:07:20.854662373Z" level=info msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\"" Feb 13 10:07:20.904838 env[1473]: time="2024-02-13T10:07:20.904744602Z" level=error msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\" failed" error="failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:07:20.905077 kubelet[2593]: E0213 10:07:20.905025 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768" Feb 13 10:07:20.905077 kubelet[2593]: E0213 10:07:20.905073 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768} Feb 13 10:07:20.905523 kubelet[2593]: E0213 10:07:20.905127 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:07:20.905523 kubelet[2593]: E0213 10:07:20.905170 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-86cd8c4979-2tlsw" podUID=18384425-4aba-475c-a64f-6bfe3101b275 Feb 13 10:07:21.245034 systemd[1]: Started sshd@83-139.178.70.43:22-139.178.68.195:49198.service. 
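
[editor's note] The avc: denied { watch } records earlier in this trace pair with SYSCALL records showing arch=c000003e syscall=254 exit=-13: on x86_64, syscall 254 is inotify_add_watch and -13 is -EACCES, so SELinux policy (svirt_lxc_net_t) is refusing kube-controller-manager an inotify watch on /etc/kubernetes/pki/ca.crt; the file itself exists. A hedged sketch reproducing the same denial path through libc (assumes a glibc system; the IN_ALL_EVENTS mask is illustrative):

    import ctypes, errno

    libc = ctypes.CDLL("libc.so.6", use_errno=True)
    fd = libc.inotify_init()
    # IN_ALL_EVENTS = 0xfff; the same kind of watch the AVC record shows being refused.
    wd = libc.inotify_add_watch(fd, b"/etc/kubernetes/pki/ca.crt", 0xFFF)
    if wd == -1 and ctypes.get_errno() == errno.EACCES:
        print("watch denied with EACCES (-13), matching the SYSCALL record")
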
Feb 13 10:07:21.244000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@83-139.178.70.43:22-139.178.68.195:49198 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:07:21.271670 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 10:07:21.271708 kernel: audit: type=1130 audit(1707818841.244:1979): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@83-139.178.70.43:22-139.178.68.195:49198 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:07:21.379000 audit[12009]: USER_ACCT pid=12009 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:21.380924 sshd[12009]: Accepted publickey for core from 139.178.68.195 port 49198 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 10:07:21.382131 sshd[12009]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 10:07:21.384898 systemd-logind[1461]: New session 86 of user core. Feb 13 10:07:21.385508 systemd[1]: Started session-86.scope. Feb 13 10:07:21.469665 sshd[12009]: pam_unix(sshd:session): session closed for user core Feb 13 10:07:21.471145 systemd[1]: sshd@83-139.178.70.43:22-139.178.68.195:49198.service: Deactivated successfully. Feb 13 10:07:21.471657 systemd[1]: session-86.scope: Deactivated successfully. Feb 13 10:07:21.472038 systemd-logind[1461]: Session 86 logged out. Waiting for processes to exit. Feb 13 10:07:21.472898 systemd-logind[1461]: Removed session 86. 
Feb 13 10:07:21.380000 audit[12009]: CRED_ACQ pid=12009 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:21.564850 kernel: audit: type=1101 audit(1707818841.379:1980): pid=12009 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:21.564898 kernel: audit: type=1103 audit(1707818841.380:1981): pid=12009 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:21.564920 kernel: audit: type=1006 audit(1707818841.380:1982): pid=12009 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=86 res=1 Feb 13 10:07:21.623429 kernel: audit: type=1300 audit(1707818841.380:1982): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffeec1b8990 a2=3 a3=0 items=0 ppid=1 pid=12009 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=86 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:07:21.380000 audit[12009]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffeec1b8990 a2=3 a3=0 items=0 ppid=1 pid=12009 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=86 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:07:21.715423 kernel: audit: type=1327 audit(1707818841.380:1982): proctitle=737368643A20636F7265205B707269765D Feb 13 10:07:21.380000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 10:07:21.745872 kernel: audit: type=1105 audit(1707818841.388:1983): pid=12009 uid=0 auid=500 ses=86 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:21.388000 audit[12009]: USER_START pid=12009 uid=0 auid=500 ses=86 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:21.840290 kernel: audit: type=1103 audit(1707818841.388:1984): pid=12011 uid=0 auid=500 ses=86 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:21.388000 audit[12011]: CRED_ACQ pid=12011 uid=0 auid=500 ses=86 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:21.929505 kernel: audit: type=1106 audit(1707818841.469:1985): pid=12009 uid=0 auid=500 ses=86 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close 
grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:21.469000 audit[12009]: USER_END pid=12009 uid=0 auid=500 ses=86 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:22.025018 kernel: audit: type=1104 audit(1707818841.469:1986): pid=12009 uid=0 auid=500 ses=86 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:21.469000 audit[12009]: CRED_DISP pid=12009 uid=0 auid=500 ses=86 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:21.469000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@83-139.178.70.43:22-139.178.68.195:49198 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:07:26.480511 systemd[1]: Started sshd@84-139.178.70.43:22-139.178.68.195:51796.service. Feb 13 10:07:26.479000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@84-139.178.70.43:22-139.178.68.195:51796 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:07:26.515432 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 10:07:26.515540 kernel: audit: type=1130 audit(1707818846.479:1988): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@84-139.178.70.43:22-139.178.68.195:51796 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:07:26.622000 audit[12033]: USER_ACCT pid=12033 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:26.624307 sshd[12033]: Accepted publickey for core from 139.178.68.195 port 51796 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 10:07:26.626652 sshd[12033]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 10:07:26.629002 systemd-logind[1461]: New session 87 of user core. Feb 13 10:07:26.629574 systemd[1]: Started session-87.scope. Feb 13 10:07:26.625000 audit[12033]: CRED_ACQ pid=12033 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:26.717666 sshd[12033]: pam_unix(sshd:session): session closed for user core Feb 13 10:07:26.719065 systemd[1]: sshd@84-139.178.70.43:22-139.178.68.195:51796.service: Deactivated successfully. Feb 13 10:07:26.719499 systemd[1]: session-87.scope: Deactivated successfully. Feb 13 10:07:26.719878 systemd-logind[1461]: Session 87 logged out. Waiting for processes to exit. 
Feb 13 10:07:26.720294 systemd-logind[1461]: Removed session 87. Feb 13 10:07:26.806178 kernel: audit: type=1101 audit(1707818846.622:1989): pid=12033 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:26.806222 kernel: audit: type=1103 audit(1707818846.625:1990): pid=12033 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:26.806242 kernel: audit: type=1006 audit(1707818846.625:1991): pid=12033 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=87 res=1 Feb 13 10:07:26.625000 audit[12033]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe2d2f06c0 a2=3 a3=0 items=0 ppid=1 pid=12033 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=87 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:07:26.956825 kernel: audit: type=1300 audit(1707818846.625:1991): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe2d2f06c0 a2=3 a3=0 items=0 ppid=1 pid=12033 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=87 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:07:26.956861 kernel: audit: type=1327 audit(1707818846.625:1991): proctitle=737368643A20636F7265205B707269765D Feb 13 10:07:26.625000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 10:07:26.987262 kernel: audit: type=1105 audit(1707818846.630:1992): pid=12033 uid=0 auid=500 ses=87 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:26.630000 audit[12033]: USER_START pid=12033 uid=0 auid=500 ses=87 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:27.081736 kernel: audit: type=1103 audit(1707818846.630:1993): pid=12035 uid=0 auid=500 ses=87 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:26.630000 audit[12035]: CRED_ACQ pid=12035 uid=0 auid=500 ses=87 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:27.170948 kernel: audit: type=1106 audit(1707818846.716:1994): pid=12033 uid=0 auid=500 ses=87 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:26.716000 audit[12033]: USER_END pid=12033 uid=0 auid=500 ses=87 subj=system_u:system_r:kernel_t:s0 
msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:27.266504 kernel: audit: type=1104 audit(1707818846.717:1995): pid=12033 uid=0 auid=500 ses=87 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:26.717000 audit[12033]: CRED_DISP pid=12033 uid=0 auid=500 ses=87 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:26.717000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@84-139.178.70.43:22-139.178.68.195:51796 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:07:28.855294 env[1473]: time="2024-02-13T10:07:28.855164598Z" level=info msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\"" Feb 13 10:07:28.869737 env[1473]: time="2024-02-13T10:07:28.869698461Z" level=error msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\" failed" error="failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:07:28.869916 kubelet[2593]: E0213 10:07:28.869860 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654" Feb 13 10:07:28.869916 kubelet[2593]: E0213 10:07:28.869887 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654} Feb 13 10:07:28.869916 kubelet[2593]: E0213 10:07:28.869911 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:07:28.870180 kubelet[2593]: E0213 10:07:28.869931 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-zxn6w" podUID=ac15c9fc-cc5d-4a8f-ac09-16f6497ee733 Feb 13 10:07:31.727047 systemd[1]: Started sshd@85-139.178.70.43:22-139.178.68.195:51804.service. Feb 13 10:07:31.725000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@85-139.178.70.43:22-139.178.68.195:51804 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:07:31.754160 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 10:07:31.754215 kernel: audit: type=1130 audit(1707818851.725:1997): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@85-139.178.70.43:22-139.178.68.195:51804 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:07:31.853395 env[1473]: time="2024-02-13T10:07:31.853327099Z" level=info msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\"" Feb 13 10:07:31.853562 env[1473]: time="2024-02-13T10:07:31.853412645Z" level=info msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\"" Feb 13 10:07:31.863000 audit[12085]: USER_ACCT pid=12085 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:31.865287 sshd[12085]: Accepted publickey for core from 139.178.68.195 port 51804 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 10:07:31.866697 sshd[12085]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 10:07:31.867128 env[1473]: time="2024-02-13T10:07:31.867095829Z" level=error msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\" failed" error="failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:07:31.867200 env[1473]: time="2024-02-13T10:07:31.867096032Z" level=error msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\" failed" error="failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:07:31.867300 kubelet[2593]: E0213 10:07:31.867287 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f" Feb 13 10:07:31.867507 kubelet[2593]: E0213 10:07:31.867317 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f} Feb 13 10:07:31.867507 
kubelet[2593]: E0213 10:07:31.867287 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2" Feb 13 10:07:31.867507 kubelet[2593]: E0213 10:07:31.867346 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:07:31.867507 kubelet[2593]: E0213 10:07:31.867361 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2} Feb 13 10:07:31.867507 kubelet[2593]: E0213 10:07:31.867372 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250 Feb 13 10:07:31.867715 kubelet[2593]: E0213 10:07:31.867388 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:07:31.867715 kubelet[2593]: E0213 10:07:31.867403 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-sv24x" podUID=fe6819ac-25fb-455a-b6b5-7432acf1219d Feb 13 10:07:31.868988 systemd-logind[1461]: New session 88 of user core. Feb 13 10:07:31.869447 systemd[1]: Started session-88.scope. 
Feb 13 10:07:31.865000 audit[12085]: CRED_ACQ pid=12085 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:07:31.958731 sshd[12085]: pam_unix(sshd:session): session closed for user core
Feb 13 10:07:31.960073 systemd[1]: sshd@85-139.178.70.43:22-139.178.68.195:51804.service: Deactivated successfully.
Feb 13 10:07:31.960496 systemd[1]: session-88.scope: Deactivated successfully.
Feb 13 10:07:31.960844 systemd-logind[1461]: Session 88 logged out. Waiting for processes to exit.
Feb 13 10:07:31.961239 systemd-logind[1461]: Removed session 88.
Feb 13 10:07:32.047240 kernel: audit: type=1101 audit(1707818851.863:1998): pid=12085 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:07:32.047281 kernel: audit: type=1103 audit(1707818851.865:1999): pid=12085 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:07:32.047301 kernel: audit: type=1006 audit(1707818851.865:2000): pid=12085 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=88 res=1
Feb 13 10:07:32.105881 kernel: audit: type=1300 audit(1707818851.865:2000): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd832dafd0 a2=3 a3=0 items=0 ppid=1 pid=12085 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=88 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 10:07:31.865000 audit[12085]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd832dafd0 a2=3 a3=0 items=0 ppid=1 pid=12085 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=88 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 10:07:32.197944 kernel: audit: type=1327 audit(1707818851.865:2000): proctitle=737368643A20636F7265205B707269765D
Feb 13 10:07:31.865000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D
Feb 13 10:07:32.228425 kernel: audit: type=1105 audit(1707818851.870:2001): pid=12085 uid=0 auid=500 ses=88 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:07:31.870000 audit[12085]: USER_START pid=12085 uid=0 auid=500 ses=88 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:07:32.323087 kernel: audit: type=1103 audit(1707818851.870:2002): pid=12146 uid=0 auid=500 ses=88 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:07:31.870000 audit[12146]: CRED_ACQ pid=12146 uid=0 auid=500 ses=88 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:07:32.412293 kernel: audit: type=1106 audit(1707818851.958:2003): pid=12085 uid=0 auid=500 ses=88 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:07:31.958000 audit[12085]: USER_END pid=12085 uid=0 auid=500 ses=88 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:07:32.507821 kernel: audit: type=1104 audit(1707818851.958:2004): pid=12085 uid=0 auid=500 ses=88 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:07:31.958000 audit[12085]: CRED_DISP pid=12085 uid=0 auid=500 ses=88 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:07:31.958000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@85-139.178.70.43:22-139.178.68.195:51804 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 10:07:32.855034 env[1473]: time="2024-02-13T10:07:32.854805858Z" level=info msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\""
Feb 13 10:07:32.881103 env[1473]: time="2024-02-13T10:07:32.881067501Z" level=error msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\" failed" error="failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 10:07:32.881302 kubelet[2593]: E0213 10:07:32.881291 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768"
Feb 13 10:07:32.881522 kubelet[2593]: E0213 10:07:32.881319 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768}
Feb 13 10:07:32.881522 kubelet[2593]: E0213 10:07:32.881359 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 13 10:07:32.881522 kubelet[2593]: E0213 10:07:32.881411 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-86cd8c4979-2tlsw" podUID=18384425-4aba-475c-a64f-6bfe3101b275
Feb 13 10:07:36.968354 systemd[1]: Started sshd@86-139.178.70.43:22-139.178.68.195:52660.service.
Feb 13 10:07:36.967000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@86-139.178.70.43:22-139.178.68.195:52660 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 10:07:36.995177 kernel: kauditd_printk_skb: 1 callbacks suppressed
Feb 13 10:07:36.995243 kernel: audit: type=1130 audit(1707818856.967:2006): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@86-139.178.70.43:22-139.178.68.195:52660 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 10:07:37.104000 audit[12198]: USER_ACCT pid=12198 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:07:37.105531 sshd[12198]: Accepted publickey for core from 139.178.68.195 port 52660 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w
Feb 13 10:07:37.106632 sshd[12198]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Feb 13 10:07:37.109150 systemd-logind[1461]: New session 89 of user core.
Feb 13 10:07:37.109729 systemd[1]: Started session-89.scope.
Feb 13 10:07:37.190189 sshd[12198]: pam_unix(sshd:session): session closed for user core
Feb 13 10:07:37.191667 systemd[1]: sshd@86-139.178.70.43:22-139.178.68.195:52660.service: Deactivated successfully.
Feb 13 10:07:37.192143 systemd[1]: session-89.scope: Deactivated successfully.
Feb 13 10:07:37.192519 systemd-logind[1461]: Session 89 logged out. Waiting for processes to exit.
Feb 13 10:07:37.192948 systemd-logind[1461]: Removed session 89.
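In the audit records above, type=1327 (PROCTITLE) carries the command line of the audited process hex-encoded, with NUL bytes separating argv elements; the recurring value 737368643A20636F7265205B707269765D decodes to "sshd: core [priv]". A decoding sketch (the helper name is mine):

    # Sketch: decode an audit PROCTITLE hex payload; argv elements are
    # NUL-separated in the raw value, so show them space-separated.
    def decode_proctitle(hexstr: str) -> str:
        return bytes.fromhex(hexstr).decode("utf-8", "replace").replace("\x00", " ")

    print(decode_proctitle("737368643A20636F7265205B707269765D"))
    # -> sshd: core [priv]

The longer kube-controller-manager and kube-apiserver proctitles further down decode the same way (e.g. to "kube-apiserver --advertise-address=139.178.70.43 --allow-privileged=true --authorization-mode=Node,RBAC --client-ca-file=/etc/ku"), cut short by the kernel's fixed proctitle capture limit.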
Feb 13 10:07:37.105000 audit[12198]: CRED_ACQ pid=12198 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:07:37.289751 kernel: audit: type=1101 audit(1707818857.104:2007): pid=12198 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:07:37.289791 kernel: audit: type=1103 audit(1707818857.105:2008): pid=12198 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:07:37.289824 kernel: audit: type=1006 audit(1707818857.105:2009): pid=12198 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=89 res=1
Feb 13 10:07:37.105000 audit[12198]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc5d08c5e0 a2=3 a3=0 items=0 ppid=1 pid=12198 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=89 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 10:07:37.440420 kernel: audit: type=1300 audit(1707818857.105:2009): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc5d08c5e0 a2=3 a3=0 items=0 ppid=1 pid=12198 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=89 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 10:07:37.440458 kernel: audit: type=1327 audit(1707818857.105:2009): proctitle=737368643A20636F7265205B707269765D
Feb 13 10:07:37.105000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D
Feb 13 10:07:37.470871 kernel: audit: type=1105 audit(1707818857.110:2010): pid=12198 uid=0 auid=500 ses=89 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:07:37.110000 audit[12198]: USER_START pid=12198 uid=0 auid=500 ses=89 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:07:37.565349 kernel: audit: type=1103 audit(1707818857.111:2011): pid=12200 uid=0 auid=500 ses=89 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:07:37.111000 audit[12200]: CRED_ACQ pid=12200 uid=0 auid=500 ses=89 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:07:37.654584 kernel: audit: type=1106 audit(1707818857.189:2012): pid=12198 uid=0 auid=500 ses=89 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:07:37.189000 audit[12198]: USER_END pid=12198 uid=0 auid=500 ses=89 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:07:37.750098 kernel: audit: type=1104 audit(1707818857.189:2013): pid=12198 uid=0 auid=500 ses=89 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:07:37.189000 audit[12198]: CRED_DISP pid=12198 uid=0 auid=500 ses=89 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:07:37.190000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@86-139.178.70.43:22-139.178.68.195:52660 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 10:07:39.854513 env[1473]: time="2024-02-13T10:07:39.854413829Z" level=info msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\""
Feb 13 10:07:39.880528 env[1473]: time="2024-02-13T10:07:39.880494034Z" level=error msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\" failed" error="failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 10:07:39.880773 kubelet[2593]: E0213 10:07:39.880726 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654"
Feb 13 10:07:39.880953 kubelet[2593]: E0213 10:07:39.880793 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654}
Feb 13 10:07:39.880953 kubelet[2593]: E0213 10:07:39.880814 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 13 10:07:39.880953 kubelet[2593]: E0213 10:07:39.880832 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-zxn6w" podUID=ac15c9fc-cc5d-4a8f-ac09-16f6497ee733
Feb 13 10:07:42.200654 systemd[1]: Started sshd@87-139.178.70.43:22-139.178.68.195:52672.service.
Feb 13 10:07:42.199000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@87-139.178.70.43:22-139.178.68.195:52672 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 10:07:42.227704 kernel: kauditd_printk_skb: 1 callbacks suppressed
Feb 13 10:07:42.227804 kernel: audit: type=1130 audit(1707818862.199:2015): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@87-139.178.70.43:22-139.178.68.195:52672 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 10:07:42.335000 audit[12253]: USER_ACCT pid=12253 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:07:42.337473 sshd[12253]: Accepted publickey for core from 139.178.68.195 port 52672 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w
Feb 13 10:07:42.339213 sshd[12253]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Feb 13 10:07:42.341698 systemd-logind[1461]: New session 90 of user core.
Feb 13 10:07:42.342152 systemd[1]: Started session-90.scope.
Feb 13 10:07:42.337000 audit[12253]: CRED_ACQ pid=12253 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:07:42.519305 kernel: audit: type=1101 audit(1707818862.335:2016): pid=12253 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:07:42.519350 kernel: audit: type=1103 audit(1707818862.337:2017): pid=12253 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:07:42.519372 kernel: audit: type=1006 audit(1707818862.337:2018): pid=12253 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=90 res=1
Feb 13 10:07:42.577944 kernel: audit: type=1300 audit(1707818862.337:2018): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe110d9430 a2=3 a3=0 items=0 ppid=1 pid=12253 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=90 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 10:07:42.337000 audit[12253]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe110d9430 a2=3 a3=0 items=0 ppid=1 pid=12253 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=90 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 10:07:42.669969 kernel: audit: type=1327 audit(1707818862.337:2018): proctitle=737368643A20636F7265205B707269765D
Feb 13 10:07:42.337000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D
Feb 13 10:07:42.670202 sshd[12253]: pam_unix(sshd:session): session closed for user core
Feb 13 10:07:42.671717 systemd[1]: sshd@87-139.178.70.43:22-139.178.68.195:52672.service: Deactivated successfully.
Feb 13 10:07:42.672177 systemd[1]: session-90.scope: Deactivated successfully.
Feb 13 10:07:42.672547 systemd-logind[1461]: Session 90 logged out. Waiting for processes to exit.
Feb 13 10:07:42.673058 systemd-logind[1461]: Removed session 90.
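Each audit(1707818862.337:2018)-style stamp in these records is epoch seconds, milliseconds, and an event serial; records sharing a serial (here the type=1006/1300/1327 trio for session 90) describe one event, and the epoch values line up with the journald wall clock (1707818862 is 2024-02-13 10:07:42 UTC). A small sketch, again assuming journal text on stdin, that regroups kernel audit records by serial:

    # Sketch: group kernel "audit(<epoch>.<ms>:<serial>)" records by serial so
    # multi-record events (SYSCALL + PROCTITLE + PAM result) read as one unit.
    import re
    import sys
    from collections import defaultdict
    from datetime import datetime, timezone

    STAMP = re.compile(r"audit\((\d+)\.(\d+):(\d+)\)")

    events = defaultdict(list)
    for line in sys.stdin:
        for sec, _ms, serial in STAMP.findall(line):
            events[int(serial)].append(int(sec))

    for serial in sorted(events):
        when = datetime.fromtimestamp(events[serial][0], tz=timezone.utc)
        print(f"event {serial}: {when:%Y-%m-%d %H:%M:%S} UTC, {len(events[serial])} record(s)")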
Feb 13 10:07:42.342000 audit[12253]: USER_START pid=12253 uid=0 auid=500 ses=90 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:07:42.794942 kernel: audit: type=1105 audit(1707818862.342:2019): pid=12253 uid=0 auid=500 ses=90 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:07:42.794976 kernel: audit: type=1103 audit(1707818862.343:2020): pid=12255 uid=0 auid=500 ses=90 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:07:42.343000 audit[12255]: CRED_ACQ pid=12255 uid=0 auid=500 ses=90 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:07:42.853709 env[1473]: time="2024-02-13T10:07:42.853665367Z" level=info msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\""
Feb 13 10:07:42.853867 env[1473]: time="2024-02-13T10:07:42.853733565Z" level=info msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\""
Feb 13 10:07:42.866063 env[1473]: time="2024-02-13T10:07:42.866026748Z" level=error msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\" failed" error="failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 10:07:42.866217 kubelet[2593]: E0213 10:07:42.866205 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f"
Feb 13 10:07:42.866398 kubelet[2593]: E0213 10:07:42.866234 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f}
Feb 13 10:07:42.866398 kubelet[2593]: E0213 10:07:42.866257 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 13 10:07:42.866398 kubelet[2593]: E0213 10:07:42.866273 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250
Feb 13 10:07:42.870457 env[1473]: time="2024-02-13T10:07:42.870392010Z" level=error msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\" failed" error="failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 10:07:42.870560 kubelet[2593]: E0213 10:07:42.870523 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2"
Feb 13 10:07:42.870560 kubelet[2593]: E0213 10:07:42.870539 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2}
Feb 13 10:07:42.870560 kubelet[2593]: E0213 10:07:42.870556 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 13 10:07:42.870652 kubelet[2593]: E0213 10:07:42.870571 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-sv24x" podUID=fe6819ac-25fb-455a-b6b5-7432acf1219d
Feb 13 10:07:42.884408 kernel: audit: type=1106 audit(1707818862.669:2021): pid=12253 uid=0 auid=500 ses=90 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:07:42.669000 audit[12253]: USER_END pid=12253 uid=0 auid=500 ses=90 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:07:42.979931 kernel: audit: type=1104 audit(1707818862.669:2022): pid=12253 uid=0 auid=500 ses=90 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:07:42.669000 audit[12253]: CRED_DISP pid=12253 uid=0 auid=500 ses=90 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:07:42.670000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@87-139.178.70.43:22-139.178.68.195:52672 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 10:07:46.855263 env[1473]: time="2024-02-13T10:07:46.855173783Z" level=info msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\""
Feb 13 10:07:46.870257 env[1473]: time="2024-02-13T10:07:46.870222055Z" level=error msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\" failed" error="failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 10:07:46.870421 kubelet[2593]: E0213 10:07:46.870408 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768"
Feb 13 10:07:46.870615 kubelet[2593]: E0213 10:07:46.870442 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768}
Feb 13 10:07:46.870615 kubelet[2593]: E0213 10:07:46.870475 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 13 10:07:46.870615 kubelet[2593]: E0213 10:07:46.870506 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-86cd8c4979-2tlsw" podUID=18384425-4aba-475c-a64f-6bfe3101b275
Feb 13 10:07:47.621288 systemd[1]: Started sshd@88-139.178.70.43:22-139.178.68.195:59614.service.
Feb 13 10:07:47.620000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@88-139.178.70.43:22-139.178.68.195:59614 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 10:07:47.648273 kernel: kauditd_printk_skb: 1 callbacks suppressed
Feb 13 10:07:47.648316 kernel: audit: type=1130 audit(1707818867.620:2024): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@88-139.178.70.43:22-139.178.68.195:59614 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 10:07:47.758000 audit[12364]: USER_ACCT pid=12364 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:07:47.758827 sshd[12364]: Accepted publickey for core from 139.178.68.195 port 59614 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w
Feb 13 10:07:47.761217 sshd[12364]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Feb 13 10:07:47.763582 systemd-logind[1461]: New session 91 of user core.
Feb 13 10:07:47.764228 systemd[1]: Started session-91.scope.
Feb 13 10:07:47.842663 sshd[12364]: pam_unix(sshd:session): session closed for user core
Feb 13 10:07:47.843985 systemd[1]: sshd@88-139.178.70.43:22-139.178.68.195:59614.service: Deactivated successfully.
Feb 13 10:07:47.844421 systemd[1]: session-91.scope: Deactivated successfully.
Feb 13 10:07:47.844822 systemd-logind[1461]: Session 91 logged out. Waiting for processes to exit.
Feb 13 10:07:47.845277 systemd-logind[1461]: Removed session 91.
Feb 13 10:07:47.759000 audit[12364]: CRED_ACQ pid=12364 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:07:47.941727 kernel: audit: type=1101 audit(1707818867.758:2025): pid=12364 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:07:47.941762 kernel: audit: type=1103 audit(1707818867.759:2026): pid=12364 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:07:47.941782 kernel: audit: type=1006 audit(1707818867.759:2027): pid=12364 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=91 res=1
Feb 13 10:07:48.000421 kernel: audit: type=1300 audit(1707818867.759:2027): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc247c0f10 a2=3 a3=0 items=0 ppid=1 pid=12364 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=91 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 10:07:47.759000 audit[12364]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc247c0f10 a2=3 a3=0 items=0 ppid=1 pid=12364 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=91 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 10:07:48.092425 kernel: audit: type=1327 audit(1707818867.759:2027): proctitle=737368643A20636F7265205B707269765D
Feb 13 10:07:47.759000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D
Feb 13 10:07:48.122888 kernel: audit: type=1105 audit(1707818867.765:2028): pid=12364 uid=0 auid=500 ses=91 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:07:47.765000 audit[12364]: USER_START pid=12364 uid=0 auid=500 ses=91 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:07:48.217424 kernel: audit: type=1103 audit(1707818867.765:2029): pid=12366 uid=0 auid=500 ses=91 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:07:47.765000 audit[12366]: CRED_ACQ pid=12366 uid=0 auid=500 ses=91 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:07:48.306669 kernel: audit: type=1106 audit(1707818867.841:2030): pid=12364 uid=0 auid=500 ses=91 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:07:47.841000 audit[12364]: USER_END pid=12364 uid=0 auid=500 ses=91 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:07:48.402253 kernel: audit: type=1104 audit(1707818867.841:2031): pid=12364 uid=0 auid=500 ses=91 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:07:47.841000 audit[12364]: CRED_DISP pid=12364 uid=0 auid=500 ses=91 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:07:47.842000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@88-139.178.70.43:22-139.178.68.195:59614 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 10:07:52.852433 systemd[1]: Started sshd@89-139.178.70.43:22-139.178.68.195:59618.service.
Feb 13 10:07:52.851000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@89-139.178.70.43:22-139.178.68.195:59618 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 10:07:52.894713 kernel: kauditd_printk_skb: 1 callbacks suppressed
Feb 13 10:07:52.894787 kernel: audit: type=1130 audit(1707818872.851:2033): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@89-139.178.70.43:22-139.178.68.195:59618 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 10:07:53.002000 audit[12388]: USER_ACCT pid=12388 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 10:07:53.004235 sshd[12388]: Accepted publickey for core from 139.178.68.195 port 59618 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w
Feb 13 10:07:53.006377 sshd[12388]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Feb 13 10:07:53.008747 systemd-logind[1461]: New session 92 of user core.
Feb 13 10:07:53.009207 systemd[1]: Started session-92.scope.
Feb 13 10:07:53.090552 sshd[12388]: pam_unix(sshd:session): session closed for user core
Feb 13 10:07:53.091944 systemd[1]: sshd@89-139.178.70.43:22-139.178.68.195:59618.service: Deactivated successfully.
Feb 13 10:07:53.092366 systemd[1]: session-92.scope: Deactivated successfully.
Feb 13 10:07:53.092783 systemd-logind[1461]: Session 92 logged out. Waiting for processes to exit.
Feb 13 10:07:53.093248 systemd-logind[1461]: Removed session 92.
Feb 13 10:07:53.004000 audit[12388]: CRED_ACQ pid=12388 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:53.186213 kernel: audit: type=1101 audit(1707818873.002:2034): pid=12388 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:53.186282 kernel: audit: type=1103 audit(1707818873.004:2035): pid=12388 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:53.186302 kernel: audit: type=1006 audit(1707818873.004:2036): pid=12388 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=92 res=1 Feb 13 10:07:53.244921 kernel: audit: type=1300 audit(1707818873.004:2036): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff02eec330 a2=3 a3=0 items=0 ppid=1 pid=12388 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=92 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:07:53.004000 audit[12388]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff02eec330 a2=3 a3=0 items=0 ppid=1 pid=12388 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=92 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:07:53.004000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 10:07:53.367523 kernel: audit: type=1327 audit(1707818873.004:2036): proctitle=737368643A20636F7265205B707269765D Feb 13 10:07:53.367573 kernel: audit: type=1105 audit(1707818873.010:2037): pid=12388 uid=0 auid=500 ses=92 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:53.010000 audit[12388]: USER_START pid=12388 uid=0 auid=500 ses=92 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:53.010000 audit[12390]: CRED_ACQ pid=12390 uid=0 auid=500 ses=92 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:53.551512 kernel: audit: type=1103 audit(1707818873.010:2038): pid=12390 uid=0 auid=500 ses=92 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:53.551575 kernel: audit: type=1106 audit(1707818873.089:2039): pid=12388 uid=0 auid=500 ses=92 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close 
grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:53.089000 audit[12388]: USER_END pid=12388 uid=0 auid=500 ses=92 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:53.089000 audit[12388]: CRED_DISP pid=12388 uid=0 auid=500 ses=92 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:53.736424 kernel: audit: type=1104 audit(1707818873.089:2040): pid=12388 uid=0 auid=500 ses=92 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:53.090000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@89-139.178.70.43:22-139.178.68.195:59618 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:07:53.854960 env[1473]: time="2024-02-13T10:07:53.854787077Z" level=info msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\"" Feb 13 10:07:53.879291 env[1473]: time="2024-02-13T10:07:53.879231714Z" level=error msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\" failed" error="failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:07:53.879420 kubelet[2593]: E0213 10:07:53.879407 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654" Feb 13 10:07:53.879619 kubelet[2593]: E0213 10:07:53.879438 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654} Feb 13 10:07:53.879619 kubelet[2593]: E0213 10:07:53.879473 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:07:53.879619 kubelet[2593]: E0213 10:07:53.879501 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for 
\"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-zxn6w" podUID=ac15c9fc-cc5d-4a8f-ac09-16f6497ee733 Feb 13 10:07:55.211000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:07:55.211000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c0006c7770 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 10:07:55.211000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 10:07:55.212000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:07:55.212000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c0016f43c0 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 10:07:55.212000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 10:07:55.440000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:07:55.440000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:07:55.440000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=67 a1=c0124bab40 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 10:07:55.440000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=68 a1=c001a3b720 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 
tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 10:07:55.440000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 10:07:55.440000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 10:07:55.441000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:07:55.441000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=67 a1=c0124baba0 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 10:07:55.441000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 10:07:55.443000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:07:55.443000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=67 a1=c013e600c0 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 10:07:55.443000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 10:07:55.443000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/apiserver.crt" dev="sdb9" ino=525073 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:07:55.443000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=68 a1=c007a4e390 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 10:07:55.443000 audit: PROCTITLE 
proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 10:07:55.443000 audit[2425]: AVC avc: denied { watch } for pid=2425 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-client.crt" dev="sdb9" ino=525079 scontext=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:07:55.443000 audit[2425]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=69 a1=c0049460f0 a2=fc6 a3=0 items=0 ppid=2247 pid=2425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c599,c785 key=(null) Feb 13 10:07:55.443000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75 Feb 13 10:07:56.854854 env[1473]: time="2024-02-13T10:07:56.854753368Z" level=info msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\"" Feb 13 10:07:56.906796 env[1473]: time="2024-02-13T10:07:56.906695620Z" level=error msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\" failed" error="failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:07:56.907039 kubelet[2593]: E0213 10:07:56.906975 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2" Feb 13 10:07:56.907039 kubelet[2593]: E0213 10:07:56.907025 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2} Feb 13 10:07:56.907476 kubelet[2593]: E0213 10:07:56.907090 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:07:56.907476 kubelet[2593]: E0213 10:07:56.907131 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox 
\\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-sv24x" podUID=fe6819ac-25fb-455a-b6b5-7432acf1219d Feb 13 10:07:57.854852 env[1473]: time="2024-02-13T10:07:57.854722645Z" level=info msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\"" Feb 13 10:07:57.910209 env[1473]: time="2024-02-13T10:07:57.910072919Z" level=error msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\" failed" error="failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:07:57.910960 kubelet[2593]: E0213 10:07:57.910529 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f" Feb 13 10:07:57.910960 kubelet[2593]: E0213 10:07:57.910614 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f} Feb 13 10:07:57.910960 kubelet[2593]: E0213 10:07:57.910703 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:07:57.910960 kubelet[2593]: E0213 10:07:57.910772 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250 Feb 13 10:07:58.100790 systemd[1]: Started sshd@90-139.178.70.43:22-139.178.68.195:54322.service. Feb 13 10:07:58.099000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@90-139.178.70.43:22-139.178.68.195:54322 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 10:07:58.127837 kernel: kauditd_printk_skb: 25 callbacks suppressed Feb 13 10:07:58.127907 kernel: audit: type=1130 audit(1707818878.099:2050): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@90-139.178.70.43:22-139.178.68.195:54322 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:07:58.256000 audit[12500]: USER_ACCT pid=12500 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:58.257606 sshd[12500]: Accepted publickey for core from 139.178.68.195 port 54322 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 10:07:58.259316 sshd[12500]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 10:07:58.263080 systemd-logind[1461]: New session 93 of user core. Feb 13 10:07:58.264007 systemd[1]: Started session-93.scope. Feb 13 10:07:58.257000 audit[12500]: CRED_ACQ pid=12500 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:58.441580 kernel: audit: type=1101 audit(1707818878.256:2051): pid=12500 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:58.441629 kernel: audit: type=1103 audit(1707818878.257:2052): pid=12500 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:58.441647 kernel: audit: type=1006 audit(1707818878.257:2053): pid=12500 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=93 res=1 Feb 13 10:07:58.257000 audit[12500]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffff7c51990 a2=3 a3=0 items=0 ppid=1 pid=12500 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=93 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:07:58.592310 kernel: audit: type=1300 audit(1707818878.257:2053): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffff7c51990 a2=3 a3=0 items=0 ppid=1 pid=12500 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=93 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:07:58.592387 kernel: audit: type=1327 audit(1707818878.257:2053): proctitle=737368643A20636F7265205B707269765D Feb 13 10:07:58.257000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 10:07:58.592543 sshd[12500]: pam_unix(sshd:session): session closed for user core Feb 13 10:07:58.593984 systemd[1]: sshd@90-139.178.70.43:22-139.178.68.195:54322.service: Deactivated successfully. Feb 13 10:07:58.594417 systemd[1]: session-93.scope: Deactivated successfully. Feb 13 10:07:58.594827 systemd-logind[1461]: Session 93 logged out. Waiting for processes to exit. Feb 13 10:07:58.595280 systemd-logind[1461]: Removed session 93. 
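The audit PROCTITLE records in this stretch carry the process command line hex-encoded, with NUL bytes separating argv entries (the sshd one just above decodes to "sshd: core [priv]"). A minimal decoding sketch in Python, using the kube-apiserver hex copied verbatim from the record at the start of this burst; auditd truncates the proctitle buffer, so the decoded output ends mid-flag:

    proctitle_hex = (
        "6B7562652D617069736572766572002D2D6164766572746973652D6164647265"
        "73733D3133392E3137382E37302E3433002D2D616C6C6F772D70726976696C65"
        "6765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F"
        "64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B75"
    )
    # NUL bytes delimit argv entries in the kernel's proctitle buffer
    argv = [a.decode() for a in bytes.fromhex(proctitle_hex).split(b"\x00")]
    print(argv)
    # ['kube-apiserver', '--advertise-address=139.178.70.43',
    #  '--allow-privileged=true', '--authorization-mode=Node,RBAC',
    #  '--client-ca-file=/etc/ku']   (truncated by auditd, not by the decoder)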
Feb 13 10:07:58.622813 kernel: audit: type=1105 audit(1707818878.266:2054): pid=12500 uid=0 auid=500 ses=93 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:58.266000 audit[12500]: USER_START pid=12500 uid=0 auid=500 ses=93 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:58.717327 kernel: audit: type=1103 audit(1707818878.267:2055): pid=12502 uid=0 auid=500 ses=93 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:58.267000 audit[12502]: CRED_ACQ pid=12502 uid=0 auid=500 ses=93 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:58.806559 kernel: audit: type=1106 audit(1707818878.591:2056): pid=12500 uid=0 auid=500 ses=93 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:58.591000 audit[12500]: USER_END pid=12500 uid=0 auid=500 ses=93 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:58.854315 env[1473]: time="2024-02-13T10:07:58.854294415Z" level=info msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\"" Feb 13 10:07:58.866664 env[1473]: time="2024-02-13T10:07:58.866606461Z" level=error msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\" failed" error="failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:07:58.866760 kubelet[2593]: E0213 10:07:58.866695 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768" Feb 13 10:07:58.866760 kubelet[2593]: E0213 10:07:58.866720 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768} Feb 13 10:07:58.866835 kubelet[2593]: E0213 10:07:58.866764 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" 
err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:07:58.866835 kubelet[2593]: E0213 10:07:58.866790 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-86cd8c4979-2tlsw" podUID=18384425-4aba-475c-a64f-6bfe3101b275 Feb 13 10:07:58.902128 kernel: audit: type=1104 audit(1707818878.592:2057): pid=12500 uid=0 auid=500 ses=93 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:58.592000 audit[12500]: CRED_DISP pid=12500 uid=0 auid=500 ses=93 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:07:58.592000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@90-139.178.70.43:22-139.178.68.195:54322 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:08:03.543511 systemd[1]: Started sshd@91-139.178.70.43:22-139.178.68.195:54332.service. Feb 13 10:08:03.542000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@91-139.178.70.43:22-139.178.68.195:54332 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:08:03.570396 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 10:08:03.570503 kernel: audit: type=1130 audit(1707818883.542:2059): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@91-139.178.70.43:22-139.178.68.195:54332 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:08:03.679000 audit[12554]: USER_ACCT pid=12554 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:08:03.680532 sshd[12554]: Accepted publickey for core from 139.178.68.195 port 54332 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 10:08:03.682623 sshd[12554]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 10:08:03.684918 systemd-logind[1461]: New session 94 of user core. Feb 13 10:08:03.685541 systemd[1]: Started session-94.scope. 
Feb 13 10:08:03.766534 sshd[12554]: pam_unix(sshd:session): session closed for user core Feb 13 10:08:03.768018 systemd[1]: sshd@91-139.178.70.43:22-139.178.68.195:54332.service: Deactivated successfully. Feb 13 10:08:03.768517 systemd[1]: session-94.scope: Deactivated successfully. Feb 13 10:08:03.768972 systemd-logind[1461]: Session 94 logged out. Waiting for processes to exit. Feb 13 10:08:03.769446 systemd-logind[1461]: Removed session 94. Feb 13 10:08:03.681000 audit[12554]: CRED_ACQ pid=12554 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:08:03.774344 kernel: audit: type=1101 audit(1707818883.679:2060): pid=12554 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:08:03.774402 kernel: audit: type=1103 audit(1707818883.681:2061): pid=12554 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:08:03.923203 kernel: audit: type=1006 audit(1707818883.681:2062): pid=12554 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=94 res=1 Feb 13 10:08:03.923238 kernel: audit: type=1300 audit(1707818883.681:2062): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd23860e00 a2=3 a3=0 items=0 ppid=1 pid=12554 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=94 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:08:03.681000 audit[12554]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd23860e00 a2=3 a3=0 items=0 ppid=1 pid=12554 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=94 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:08:04.015352 kernel: audit: type=1327 audit(1707818883.681:2062): proctitle=737368643A20636F7265205B707269765D Feb 13 10:08:03.681000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 10:08:04.045840 kernel: audit: type=1105 audit(1707818883.686:2063): pid=12554 uid=0 auid=500 ses=94 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:08:03.686000 audit[12554]: USER_START pid=12554 uid=0 auid=500 ses=94 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:08:04.140334 kernel: audit: type=1103 audit(1707818883.687:2064): pid=12556 uid=0 auid=500 ses=94 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:08:03.687000 audit[12556]: CRED_ACQ pid=12556 uid=0 auid=500 ses=94 
subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:08:03.765000 audit[12554]: USER_END pid=12554 uid=0 auid=500 ses=94 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:08:04.325119 kernel: audit: type=1106 audit(1707818883.765:2065): pid=12554 uid=0 auid=500 ses=94 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:08:04.325156 kernel: audit: type=1104 audit(1707818883.765:2066): pid=12554 uid=0 auid=500 ses=94 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:08:03.765000 audit[12554]: CRED_DISP pid=12554 uid=0 auid=500 ses=94 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:08:03.766000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@91-139.178.70.43:22-139.178.68.195:54332 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:08:08.776440 systemd[1]: Started sshd@92-139.178.70.43:22-139.178.68.195:57456.service. Feb 13 10:08:08.775000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@92-139.178.70.43:22-139.178.68.195:57456 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:08:08.803164 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 10:08:08.803226 kernel: audit: type=1130 audit(1707818888.775:2068): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@92-139.178.70.43:22-139.178.68.195:57456 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 10:08:08.853746 env[1473]: time="2024-02-13T10:08:08.853674529Z" level=info msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\"" Feb 13 10:08:08.904769 env[1473]: time="2024-02-13T10:08:08.904695198Z" level=error msg="StopPodSandbox for \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\" failed" error="failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:08:08.904858 kubelet[2593]: E0213 10:08:08.904846 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654" Feb 13 10:08:08.905014 kubelet[2593]: E0213 10:08:08.904868 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654} Feb 13 10:08:08.905014 kubelet[2593]: E0213 10:08:08.904888 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:08:08.905014 kubelet[2593]: E0213 10:08:08.904904 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"ac15c9fc-cc5d-4a8f-ac09-16f6497ee733\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d12d1d6e2c1c98279cfb62e102b6547ff1e7dce9060d46ec25c8be5cfec6654\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-zxn6w" podUID=ac15c9fc-cc5d-4a8f-ac09-16f6497ee733 Feb 13 10:08:08.913000 audit[12579]: USER_ACCT pid=12579 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:08:08.913915 sshd[12579]: Accepted publickey for core from 139.178.68.195 port 57456 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 10:08:08.915646 sshd[12579]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 10:08:08.918010 systemd-logind[1461]: New session 95 of user core. Feb 13 10:08:08.918435 systemd[1]: Started session-95.scope. Feb 13 10:08:08.996437 sshd[12579]: pam_unix(sshd:session): session closed for user core Feb 13 10:08:08.997876 systemd[1]: sshd@92-139.178.70.43:22-139.178.68.195:57456.service: Deactivated successfully. 
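Each short-lived SSH session above emits a paired run of audit records (USER_ACCT, CRED_ACQ, USER_START, USER_END, CRED_DISP) that correlate on the pid and ses fields. A minimal sketch of pulling those fields out of one record; this is a naive whitespace split (tokens inside the quoted msg='...' payload are also treated as top-level key=value pairs, which is fine for correlating pid/ses):

    def audit_fields(record: str) -> dict:
        # collect bare key=value tokens; skip the msg= token itself
        fields = {}
        for token in record.split():
            if "=" in token and not token.startswith("msg="):
                key, _, value = token.partition("=")
                fields[key] = value
        return fields

    rec = "audit[12579]: USER_ACCT pid=12579 uid=0 auid=4294967295 ses=4294967295"
    print(audit_fields(rec))
    # {'pid': '12579', 'uid': '0', 'auid': '4294967295', 'ses': '4294967295'}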
Feb 13 10:08:08.998290 systemd[1]: session-95.scope: Deactivated successfully. Feb 13 10:08:08.998653 systemd-logind[1461]: Session 95 logged out. Waiting for processes to exit. Feb 13 10:08:08.999168 systemd-logind[1461]: Removed session 95. Feb 13 10:08:08.915000 audit[12579]: CRED_ACQ pid=12579 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:08:09.095984 kernel: audit: type=1101 audit(1707818888.913:2069): pid=12579 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:08:09.096025 kernel: audit: type=1103 audit(1707818888.915:2070): pid=12579 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:08:09.096043 kernel: audit: type=1006 audit(1707818888.915:2071): pid=12579 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=95 res=1 Feb 13 10:08:09.154635 kernel: audit: type=1300 audit(1707818888.915:2071): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc3254ebe0 a2=3 a3=0 items=0 ppid=1 pid=12579 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=95 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:08:08.915000 audit[12579]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc3254ebe0 a2=3 a3=0 items=0 ppid=1 pid=12579 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=95 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:08:09.246648 kernel: audit: type=1327 audit(1707818888.915:2071): proctitle=737368643A20636F7265205B707269765D Feb 13 10:08:08.915000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 10:08:09.277101 kernel: audit: type=1105 audit(1707818888.919:2072): pid=12579 uid=0 auid=500 ses=95 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:08:08.919000 audit[12579]: USER_START pid=12579 uid=0 auid=500 ses=95 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:08:09.371625 kernel: audit: type=1103 audit(1707818888.920:2073): pid=12610 uid=0 auid=500 ses=95 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:08:08.920000 audit[12610]: CRED_ACQ pid=12610 uid=0 auid=500 ses=95 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 
10:08:09.460846 kernel: audit: type=1106 audit(1707818888.996:2074): pid=12579 uid=0 auid=500 ses=95 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:08:08.996000 audit[12579]: USER_END pid=12579 uid=0 auid=500 ses=95 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:08:09.556373 kernel: audit: type=1104 audit(1707818888.996:2075): pid=12579 uid=0 auid=500 ses=95 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:08:08.996000 audit[12579]: CRED_DISP pid=12579 uid=0 auid=500 ses=95 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:08:08.997000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@92-139.178.70.43:22-139.178.68.195:57456 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:08:09.620000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:08:09.620000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c0016f4c00 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 10:08:09.620000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 10:08:09.620000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:08:09.620000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c00241df40 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 10:08:09.620000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 10:08:09.622000 audit[2416]: AVC avc: denied { watch } for pid=2416 
comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:08:09.622000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c002267c20 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 10:08:09.622000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 10:08:09.622000 audit[2416]: AVC avc: denied { watch } for pid=2416 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 10:08:09.622000 audit[2416]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=c a1=c002267c40 a2=fc6 a3=0 items=0 ppid=2282 pid=2416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c80,c952 key=(null) Feb 13 10:08:09.622000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 10:08:09.854586 env[1473]: time="2024-02-13T10:08:09.854299482Z" level=info msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\"" Feb 13 10:08:09.854586 env[1473]: time="2024-02-13T10:08:09.854387251Z" level=info msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\"" Feb 13 10:08:09.881662 env[1473]: time="2024-02-13T10:08:09.881594679Z" level=error msg="StopPodSandbox for \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\" failed" error="failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:08:09.881816 env[1473]: time="2024-02-13T10:08:09.881773150Z" level=error msg="StopPodSandbox for \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\" failed" error="failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:08:09.881904 kubelet[2593]: E0213 10:08:09.881808 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/" podSandboxID="b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768" Feb 13 10:08:09.881904 kubelet[2593]: E0213 10:08:09.881856 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768} Feb 13 10:08:09.882000 kubelet[2593]: E0213 10:08:09.881914 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:08:09.882000 kubelet[2593]: E0213 10:08:09.881950 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"18384425-4aba-475c-a64f-6bfe3101b275\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b3cffff3a360f2936d990b11df8843c4f852598db1405e843534d028da29d768\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-86cd8c4979-2tlsw" podUID=18384425-4aba-475c-a64f-6bfe3101b275 Feb 13 10:08:09.882000 kubelet[2593]: E0213 10:08:09.881969 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2" Feb 13 10:08:09.882000 kubelet[2593]: E0213 10:08:09.881983 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2} Feb 13 10:08:09.882144 kubelet[2593]: E0213 10:08:09.882002 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:08:09.882144 kubelet[2593]: E0213 10:08:09.882016 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"fe6819ac-25fb-455a-b6b5-7432acf1219d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de8210eb476cc3a1bc18f359069941242a0a13efa55aa05ded6d4b0b622f45b2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-sv24x" podUID=fe6819ac-25fb-455a-b6b5-7432acf1219d Feb 13 10:08:11.854845 
env[1473]: time="2024-02-13T10:08:11.854754842Z" level=info msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\"" Feb 13 10:08:11.906393 env[1473]: time="2024-02-13T10:08:11.906294979Z" level=error msg="StopPodSandbox for \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\" failed" error="failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 10:08:11.906622 kubelet[2593]: E0213 10:08:11.906565 2593 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f" Feb 13 10:08:11.906622 kubelet[2593]: E0213 10:08:11.906608 2593 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f} Feb 13 10:08:11.907066 kubelet[2593]: E0213 10:08:11.906667 2593 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 10:08:11.907066 kubelet[2593]: E0213 10:08:11.906709 2593 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"70a6a2a2-80be-4700-bde4-cdae2bf45250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a84d3a4470e1de22f996d05db84d108b6ceb6cc0d197cf39c38f318c0d0954f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-w8xgk" podUID=70a6a2a2-80be-4700-bde4-cdae2bf45250 Feb 13 10:08:14.005515 systemd[1]: Started sshd@93-139.178.70.43:22-139.178.68.195:57464.service. Feb 13 10:08:14.004000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@93-139.178.70.43:22-139.178.68.195:57464 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 10:08:14.032381 kernel: kauditd_printk_skb: 13 callbacks suppressed Feb 13 10:08:14.032461 kernel: audit: type=1130 audit(1707818894.004:2081): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@93-139.178.70.43:22-139.178.68.195:57464 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 10:08:14.141000 audit[12721]: USER_ACCT pid=12721 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:08:14.142191 sshd[12721]: Accepted publickey for core from 139.178.68.195 port 57464 ssh2: RSA SHA256:wM1bdaCPwerSW1mOnJZTsZDRswKX2qe3WXCkDWmUy9w Feb 13 10:08:14.145613 sshd[12721]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 10:08:14.147889 systemd-logind[1461]: New session 96 of user core. Feb 13 10:08:14.148437 systemd[1]: Started session-96.scope. Feb 13 10:08:14.227702 sshd[12721]: pam_unix(sshd:session): session closed for user core Feb 13 10:08:14.229137 systemd[1]: sshd@93-139.178.70.43:22-139.178.68.195:57464.service: Deactivated successfully. Feb 13 10:08:14.229641 systemd[1]: session-96.scope: Deactivated successfully. Feb 13 10:08:14.230013 systemd-logind[1461]: Session 96 logged out. Waiting for processes to exit. Feb 13 10:08:14.230566 systemd-logind[1461]: Removed session 96. Feb 13 10:08:14.144000 audit[12721]: CRED_ACQ pid=12721 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:08:14.326960 kernel: audit: type=1101 audit(1707818894.141:2082): pid=12721 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:08:14.326997 kernel: audit: type=1103 audit(1707818894.144:2083): pid=12721 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:08:14.327011 kernel: audit: type=1006 audit(1707818894.144:2084): pid=12721 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=96 res=1 Feb 13 10:08:14.385568 kernel: audit: type=1300 audit(1707818894.144:2084): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff4a58cb10 a2=3 a3=0 items=0 ppid=1 pid=12721 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=96 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:08:14.144000 audit[12721]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff4a58cb10 a2=3 a3=0 items=0 ppid=1 pid=12721 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=96 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 10:08:14.477724 kernel: audit: type=1327 audit(1707818894.144:2084): proctitle=737368643A20636F7265205B707269765D Feb 13 10:08:14.144000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 10:08:14.508207 kernel: audit: type=1105 audit(1707818894.149:2085): pid=12721 uid=0 auid=500 ses=96 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 
10:08:14.149000 audit[12721]: USER_START pid=12721 uid=0 auid=500 ses=96 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:08:14.602658 kernel: audit: type=1103 audit(1707818894.150:2086): pid=12723 uid=0 auid=500 ses=96 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:08:14.150000 audit[12723]: CRED_ACQ pid=12723 uid=0 auid=500 ses=96 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:08:14.691892 kernel: audit: type=1106 audit(1707818894.227:2087): pid=12721 uid=0 auid=500 ses=96 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:08:14.227000 audit[12721]: USER_END pid=12721 uid=0 auid=500 ses=96 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:08:14.787423 kernel: audit: type=1104 audit(1707818894.227:2088): pid=12721 uid=0 auid=500 ses=96 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:08:14.227000 audit[12721]: CRED_DISP pid=12721 uid=0 auid=500 ses=96 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 10:08:14.227000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@93-139.178.70.43:22-139.178.68.195:57464 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
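The kube-controller AVC denials above all show arch=c000003e (x86-64), syscall=254, exit=-13. A minimal sketch for turning those raw numbers into names; the syscall mapping is a hand-written assumption from the x86-64 syscall table (where 254 is inotify_add_watch, consistent with the denied { watch } permission), and only the one number seen in the log is included:

    import errno

    X86_64_SYSCALLS = {254: "inotify_add_watch"}  # assumed from the x86-64 table

    syscall_nr, exit_code = 254, -13  # values from the audit records above
    print(X86_64_SYSCALLS.get(syscall_nr, "unknown"),
          "->", errno.errorcode[-exit_code])
    # inotify_add_watch -> EACCES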