Feb 9 20:31:35.546545 kernel: Linux version 5.15.148-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 11.3.1_p20221209 p3) 11.3.1 20221209, GNU ld (Gentoo 2.39 p5) 2.39.0) #1 SMP Fri Feb 9 17:23:38 -00 2024
Feb 9 20:31:35.546557 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty0 console=ttyS1,115200n8 flatcar.first_boot=detected flatcar.oem.id=packet flatcar.autologin verity.usrhash=4dbf910aaff679d18007a871aba359cc2cf6cb85992bb7598afad40271debbd6
Feb 9 20:31:35.546564 kernel: BIOS-provided physical RAM map:
Feb 9 20:31:35.546568 kernel: BIOS-e820: [mem 0x0000000000000000-0x00000000000997ff] usable
Feb 9 20:31:35.546572 kernel: BIOS-e820: [mem 0x0000000000099800-0x000000000009ffff] reserved
Feb 9 20:31:35.546575 kernel: BIOS-e820: [mem 0x00000000000e0000-0x00000000000fffff] reserved
Feb 9 20:31:35.546580 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000003fffffff] usable
Feb 9 20:31:35.546584 kernel: BIOS-e820: [mem 0x0000000040000000-0x00000000403fffff] reserved
Feb 9 20:31:35.546587 kernel: BIOS-e820: [mem 0x0000000040400000-0x0000000081f7ffff] usable
Feb 9 20:31:35.546591 kernel: BIOS-e820: [mem 0x0000000081f80000-0x0000000081f80fff] ACPI NVS
Feb 9 20:31:35.546595 kernel: BIOS-e820: [mem 0x0000000081f81000-0x0000000081f81fff] reserved
Feb 9 20:31:35.546599 kernel: BIOS-e820: [mem 0x0000000081f82000-0x000000008afccfff] usable
Feb 9 20:31:35.546603 kernel: BIOS-e820: [mem 0x000000008afcd000-0x000000008c0b1fff] reserved
Feb 9 20:31:35.546607 kernel: BIOS-e820: [mem 0x000000008c0b2000-0x000000008c23afff] usable
Feb 9 20:31:35.546612 kernel: BIOS-e820: [mem 0x000000008c23b000-0x000000008c66cfff] ACPI NVS
Feb 9 20:31:35.546617 kernel: BIOS-e820: [mem 0x000000008c66d000-0x000000008eefefff] reserved
Feb 9 20:31:35.546621 kernel: BIOS-e820: [mem 0x000000008eeff000-0x000000008eefffff] usable
Feb 9 20:31:35.546625 kernel: BIOS-e820: [mem 0x000000008ef00000-0x000000008fffffff] reserved
Feb 9 20:31:35.546629 kernel: BIOS-e820: [mem 0x00000000e0000000-0x00000000efffffff] reserved
Feb 9 20:31:35.546633 kernel: BIOS-e820: [mem 0x00000000fe000000-0x00000000fe010fff] reserved
Feb 9 20:31:35.546637 kernel: BIOS-e820: [mem 0x00000000fec00000-0x00000000fec00fff] reserved
Feb 9 20:31:35.546641 kernel: BIOS-e820: [mem 0x00000000fee00000-0x00000000fee00fff] reserved
Feb 9 20:31:35.546645 kernel: BIOS-e820: [mem 0x00000000ff000000-0x00000000ffffffff] reserved
Feb 9 20:31:35.546649 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000086effffff] usable
Feb 9 20:31:35.546653 kernel: NX (Execute Disable) protection: active
Feb 9 20:31:35.546657 kernel: SMBIOS 3.2.1 present.
Feb 9 20:31:35.546662 kernel: DMI: Supermicro SYS-5019C-MR/X11SCM-F, BIOS 1.9 09/16/2022
Feb 9 20:31:35.546666 kernel: tsc: Detected 3400.000 MHz processor
Feb 9 20:31:35.546670 kernel: tsc: Detected 3399.906 MHz TSC
Feb 9 20:31:35.546674 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Feb 9 20:31:35.546679 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Feb 9 20:31:35.546683 kernel: last_pfn = 0x86f000 max_arch_pfn = 0x400000000
Feb 9 20:31:35.546687 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Feb 9 20:31:35.546692 kernel: last_pfn = 0x8ef00 max_arch_pfn = 0x400000000
Feb 9 20:31:35.546696 kernel: Using GB pages for direct mapping
Feb 9 20:31:35.546700 kernel: ACPI: Early table checksum verification disabled
Feb 9 20:31:35.546705 kernel: ACPI: RSDP 0x00000000000F05B0 000024 (v02 SUPERM)
Feb 9 20:31:35.546709 kernel: ACPI: XSDT 0x000000008C54E0C8 00010C (v01 SUPERM SUPERM 01072009 AMI 00010013)
Feb 9 20:31:35.546713 kernel: ACPI: FACP 0x000000008C58A670 000114 (v06 01072009 AMI 00010013)
Feb 9 20:31:35.546718 kernel: ACPI: DSDT 0x000000008C54E268 03C404 (v02 SUPERM SMCI--MB 01072009 INTL 20160527)
Feb 9 20:31:35.546724 kernel: ACPI: FACS 0x000000008C66CF80 000040
Feb 9 20:31:35.546728 kernel: ACPI: APIC 0x000000008C58A788 00012C (v04 01072009 AMI 00010013)
Feb 9 20:31:35.546733 kernel: ACPI: FPDT 0x000000008C58A8B8 000044 (v01 01072009 AMI 00010013)
Feb 9 20:31:35.546738 kernel: ACPI: FIDT 0x000000008C58A900 00009C (v01 SUPERM SMCI--MB 01072009 AMI 00010013)
Feb 9 20:31:35.546743 kernel: ACPI: MCFG 0x000000008C58A9A0 00003C (v01 SUPERM SMCI--MB 01072009 MSFT 00000097)
Feb 9 20:31:35.546747 kernel: ACPI: SPMI 0x000000008C58A9E0 000041 (v05 SUPERM SMCI--MB 00000000 AMI. 00000000)
Feb 9 20:31:35.546752 kernel: ACPI: SSDT 0x000000008C58AA28 001B1C (v02 CpuRef CpuSsdt 00003000 INTL 20160527)
Feb 9 20:31:35.546756 kernel: ACPI: SSDT 0x000000008C58C548 0031C6 (v02 SaSsdt SaSsdt 00003000 INTL 20160527)
Feb 9 20:31:35.546761 kernel: ACPI: SSDT 0x000000008C58F710 00232B (v02 PegSsd PegSsdt 00001000 INTL 20160527)
Feb 9 20:31:35.546765 kernel: ACPI: HPET 0x000000008C591A40 000038 (v01 SUPERM SMCI--MB 00000002 01000013)
Feb 9 20:31:35.546771 kernel: ACPI: SSDT 0x000000008C591A78 000FAE (v02 SUPERM Ther_Rvp 00001000 INTL 20160527)
Feb 9 20:31:35.546775 kernel: ACPI: SSDT 0x000000008C592A28 0008F4 (v02 INTEL xh_mossb 00000000 INTL 20160527)
Feb 9 20:31:35.546780 kernel: ACPI: UEFI 0x000000008C593320 000042 (v01 SUPERM SMCI--MB 00000002 01000013)
Feb 9 20:31:35.546784 kernel: ACPI: LPIT 0x000000008C593368 000094 (v01 SUPERM SMCI--MB 00000002 01000013)
Feb 9 20:31:35.546789 kernel: ACPI: SSDT 0x000000008C593400 0027DE (v02 SUPERM PtidDevc 00001000 INTL 20160527)
Feb 9 20:31:35.546793 kernel: ACPI: SSDT 0x000000008C595BE0 0014E2 (v02 SUPERM TbtTypeC 00000000 INTL 20160527)
Feb 9 20:31:35.546798 kernel: ACPI: DBGP 0x000000008C5970C8 000034 (v01 SUPERM SMCI--MB 00000002 01000013)
Feb 9 20:31:35.546802 kernel: ACPI: DBG2 0x000000008C597100 000054 (v00 SUPERM SMCI--MB 00000002 01000013)
Feb 9 20:31:35.546808 kernel: ACPI: SSDT 0x000000008C597158 001B67 (v02 SUPERM UsbCTabl 00001000 INTL 20160527)
Feb 9 20:31:35.546812 kernel: ACPI: DMAR 0x000000008C598CC0 000070 (v01 INTEL EDK2 00000002 01000013)
Feb 9 20:31:35.546817 kernel: ACPI: SSDT 0x000000008C598D30 000144 (v02 Intel ADebTabl 00001000 INTL 20160527)
Feb 9 20:31:35.546821 kernel: ACPI: TPM2 0x000000008C598E78 000034 (v04 SUPERM SMCI--MB 00000001 AMI 00000000)
Feb 9 20:31:35.546826 kernel: ACPI: SSDT 0x000000008C598EB0 000D8F (v02 INTEL SpsNm 00000002 INTL 20160527)
Feb 9 20:31:35.546830 kernel: ACPI: WSMT 0x000000008C599C40 000028 (v01 SUPERM 01072009 AMI 00010013)
Feb 9 20:31:35.546835 kernel: ACPI: EINJ 0x000000008C599C68 000130 (v01 AMI AMI.EINJ 00000000 AMI. 00000000)
Feb 9 20:31:35.546839 kernel: ACPI: ERST 0x000000008C599D98 000230 (v01 AMIER AMI.ERST 00000000 AMI. 00000000)
Feb 9 20:31:35.546844 kernel: ACPI: BERT 0x000000008C599FC8 000030 (v01 AMI AMI.BERT 00000000 AMI. 00000000)
Feb 9 20:31:35.546849 kernel: ACPI: HEST 0x000000008C599FF8 00027C (v01 AMI AMI.HEST 00000000 AMI. 00000000)
Feb 9 20:31:35.546854 kernel: ACPI: SSDT 0x000000008C59A278 000162 (v01 SUPERM SMCCDN 00000000 INTL 20181221)
Feb 9 20:31:35.546858 kernel: ACPI: Reserving FACP table memory at [mem 0x8c58a670-0x8c58a783]
Feb 9 20:31:35.546863 kernel: ACPI: Reserving DSDT table memory at [mem 0x8c54e268-0x8c58a66b]
Feb 9 20:31:35.546867 kernel: ACPI: Reserving FACS table memory at [mem 0x8c66cf80-0x8c66cfbf]
Feb 9 20:31:35.546872 kernel: ACPI: Reserving APIC table memory at [mem 0x8c58a788-0x8c58a8b3]
Feb 9 20:31:35.546876 kernel: ACPI: Reserving FPDT table memory at [mem 0x8c58a8b8-0x8c58a8fb]
Feb 9 20:31:35.546881 kernel: ACPI: Reserving FIDT table memory at [mem 0x8c58a900-0x8c58a99b]
Feb 9 20:31:35.546886 kernel: ACPI: Reserving MCFG table memory at [mem 0x8c58a9a0-0x8c58a9db]
Feb 9 20:31:35.546891 kernel: ACPI: Reserving SPMI table memory at [mem 0x8c58a9e0-0x8c58aa20]
Feb 9 20:31:35.546895 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c58aa28-0x8c58c543]
Feb 9 20:31:35.546900 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c58c548-0x8c58f70d]
Feb 9 20:31:35.546904 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c58f710-0x8c591a3a]
Feb 9 20:31:35.546909 kernel: ACPI: Reserving HPET table memory at [mem 0x8c591a40-0x8c591a77]
Feb 9 20:31:35.546913 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c591a78-0x8c592a25]
Feb 9 20:31:35.546918 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c592a28-0x8c59331b]
Feb 9 20:31:35.546922 kernel: ACPI: Reserving UEFI table memory at [mem 0x8c593320-0x8c593361]
Feb 9 20:31:35.546927 kernel: ACPI: Reserving LPIT table memory at [mem 0x8c593368-0x8c5933fb]
Feb 9 20:31:35.546932 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c593400-0x8c595bdd]
Feb 9 20:31:35.546936 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c595be0-0x8c5970c1]
Feb 9 20:31:35.546941 kernel: ACPI: Reserving DBGP table memory at [mem 0x8c5970c8-0x8c5970fb]
Feb 9 20:31:35.546945 kernel: ACPI: Reserving DBG2 table memory at [mem 0x8c597100-0x8c597153]
Feb 9 20:31:35.546950 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c597158-0x8c598cbe]
Feb 9 20:31:35.546954 kernel: ACPI: Reserving DMAR table memory at [mem 0x8c598cc0-0x8c598d2f]
Feb 9 20:31:35.546959 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c598d30-0x8c598e73]
Feb 9 20:31:35.546963 kernel: ACPI: Reserving TPM2 table memory at [mem 0x8c598e78-0x8c598eab]
Feb 9 20:31:35.546969 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c598eb0-0x8c599c3e]
Feb 9 20:31:35.546973 kernel: ACPI: Reserving WSMT table memory at [mem 0x8c599c40-0x8c599c67]
Feb 9 20:31:35.546978 kernel: ACPI: Reserving EINJ table memory at [mem 0x8c599c68-0x8c599d97]
Feb 9 20:31:35.546982 kernel: ACPI: Reserving ERST table memory at [mem 0x8c599d98-0x8c599fc7]
Feb 9 20:31:35.546987 kernel: ACPI: Reserving BERT table memory at [mem 0x8c599fc8-0x8c599ff7]
Feb 9 20:31:35.546991 kernel: ACPI: Reserving HEST table memory at [mem 0x8c599ff8-0x8c59a273]
Feb 9 20:31:35.546996 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c59a278-0x8c59a3d9]
Feb 9 20:31:35.547000 kernel: No NUMA configuration found
Feb 9 20:31:35.547005 kernel: Faking a node at [mem 0x0000000000000000-0x000000086effffff]
Feb 9 20:31:35.547010 kernel: NODE_DATA(0) allocated [mem 0x86effa000-0x86effffff]
Feb 9 20:31:35.547015 kernel: Zone ranges:
Feb 9 20:31:35.547019 kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Feb 9 20:31:35.547024 kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Feb 9 20:31:35.547028 kernel:   Normal   [mem 0x0000000100000000-0x000000086effffff]
Feb 9 20:31:35.547033 kernel: Movable zone start for each node
Feb 9 20:31:35.547037 kernel: Early memory node ranges
Feb 9 20:31:35.547042 kernel:   node   0: [mem 0x0000000000001000-0x0000000000098fff]
Feb 9 20:31:35.547046 kernel:   node   0: [mem 0x0000000000100000-0x000000003fffffff]
Feb 9 20:31:35.547051 kernel:   node   0: [mem 0x0000000040400000-0x0000000081f7ffff]
Feb 9 20:31:35.547056 kernel:   node   0: [mem 0x0000000081f82000-0x000000008afccfff]
Feb 9 20:31:35.547061 kernel:   node   0: [mem 0x000000008c0b2000-0x000000008c23afff]
Feb 9 20:31:35.547065 kernel:   node   0: [mem 0x000000008eeff000-0x000000008eefffff]
Feb 9 20:31:35.547070 kernel:   node   0: [mem 0x0000000100000000-0x000000086effffff]
Feb 9 20:31:35.547074 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000086effffff]
Feb 9 20:31:35.547079 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Feb 9 20:31:35.547087 kernel: On node 0, zone DMA: 103 pages in unavailable ranges
Feb 9 20:31:35.547092 kernel: On node 0, zone DMA32: 1024 pages in unavailable ranges
Feb 9 20:31:35.547097 kernel: On node 0, zone DMA32: 2 pages in unavailable ranges
Feb 9 20:31:35.547102 kernel: On node 0, zone DMA32: 4325 pages in unavailable ranges
Feb 9 20:31:35.547108 kernel: On node 0, zone DMA32: 11460 pages in unavailable ranges
Feb 9 20:31:35.547113 kernel: On node 0, zone Normal: 4352 pages in unavailable ranges
Feb 9 20:31:35.547118 kernel: On node 0, zone Normal: 4096 pages in unavailable ranges
Feb 9 20:31:35.547123 kernel: ACPI: PM-Timer IO Port: 0x1808
Feb 9 20:31:35.547127 kernel: ACPI: LAPIC_NMI (acpi_id[0x01] high edge lint[0x1])
Feb 9 20:31:35.547132 kernel: ACPI: LAPIC_NMI (acpi_id[0x02] high edge lint[0x1])
Feb 9 20:31:35.547137 kernel: ACPI: LAPIC_NMI (acpi_id[0x03] high edge lint[0x1])
Feb 9 20:31:35.547143 kernel: ACPI: LAPIC_NMI (acpi_id[0x04] high edge lint[0x1])
Feb 9 20:31:35.547147 kernel: ACPI: LAPIC_NMI (acpi_id[0x05] high edge lint[0x1])
Feb 9 20:31:35.547152 kernel: ACPI: LAPIC_NMI (acpi_id[0x06] high edge lint[0x1])
Feb 9 20:31:35.547157 kernel: ACPI: LAPIC_NMI (acpi_id[0x07] high edge lint[0x1])
Feb 9 20:31:35.547162 kernel: ACPI: LAPIC_NMI (acpi_id[0x08] high edge lint[0x1])
Feb 9 20:31:35.547167 kernel: ACPI: LAPIC_NMI (acpi_id[0x09] high edge lint[0x1])
Feb 9 20:31:35.547171 kernel: ACPI: LAPIC_NMI (acpi_id[0x0a] high edge lint[0x1])
Feb 9 20:31:35.547176 kernel: ACPI: LAPIC_NMI (acpi_id[0x0b] high edge lint[0x1])
Feb 9 20:31:35.547181 kernel: ACPI: LAPIC_NMI (acpi_id[0x0c] high edge lint[0x1])
Feb 9 20:31:35.547186 kernel: ACPI: LAPIC_NMI (acpi_id[0x0d] high edge lint[0x1])
Feb 9 20:31:35.547191 kernel: ACPI: LAPIC_NMI (acpi_id[0x0e] high edge lint[0x1])
Feb 9 20:31:35.547196 kernel: ACPI: LAPIC_NMI (acpi_id[0x0f] high edge lint[0x1])
Feb 9 20:31:35.547201 kernel: ACPI: LAPIC_NMI (acpi_id[0x10] high edge lint[0x1])
Feb 9 20:31:35.547206 kernel: IOAPIC[0]: apic_id 2, version 32, address 0xfec00000, GSI 0-119
Feb 9 20:31:35.547211 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Feb 9 20:31:35.547215 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Feb 9 20:31:35.547220 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Feb 9 20:31:35.547225 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000
Feb 9 20:31:35.547231 kernel: TSC deadline timer available
Feb 9 20:31:35.547235 kernel: smpboot: Allowing 16 CPUs, 0 hotplug CPUs
Feb 9 20:31:35.547240 kernel: [mem 0x90000000-0xdfffffff] available for PCI devices
Feb 9 20:31:35.547245 kernel: Booting paravirtualized kernel on bare hardware
Feb 9 20:31:35.547250 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Feb 9 20:31:35.547255 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:512 nr_cpu_ids:16 nr_node_ids:1
Feb 9 20:31:35.547260 kernel: percpu: Embedded 55 pages/cpu s185624 r8192 d31464 u262144
Feb 9 20:31:35.547265 kernel: pcpu-alloc: s185624 r8192 d31464 u262144 alloc=1*2097152
Feb 9 20:31:35.547270 kernel: pcpu-alloc: [0] 00 01 02 03 04 05 06 07 [0] 08 09 10 11 12 13 14 15
Feb 9 20:31:35.547275 kernel: Built 1 zonelists, mobility grouping on. Total pages: 8232415
Feb 9 20:31:35.547280 kernel: Policy zone: Normal
Feb 9 20:31:35.547286 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty0 console=ttyS1,115200n8 flatcar.first_boot=detected flatcar.oem.id=packet flatcar.autologin verity.usrhash=4dbf910aaff679d18007a871aba359cc2cf6cb85992bb7598afad40271debbd6
Feb 9 20:31:35.547291 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Feb 9 20:31:35.547296 kernel: Dentry cache hash table entries: 4194304 (order: 13, 33554432 bytes, linear)
Feb 9 20:31:35.547300 kernel: Inode-cache hash table entries: 2097152 (order: 12, 16777216 bytes, linear)
Feb 9 20:31:35.547305 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Feb 9 20:31:35.547310 kernel: Memory: 32724720K/33452980K available (12294K kernel code, 2275K rwdata, 13700K rodata, 45496K init, 4048K bss, 728000K reserved, 0K cma-reserved)
Feb 9 20:31:35.547316 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=16, Nodes=1
Feb 9 20:31:35.547321 kernel: ftrace: allocating 34475 entries in 135 pages
Feb 9 20:31:35.547326 kernel: ftrace: allocated 135 pages with 4 groups
Feb 9 20:31:35.547331 kernel: rcu: Hierarchical RCU implementation.
Feb 9 20:31:35.547336 kernel: rcu: RCU event tracing is enabled.
Feb 9 20:31:35.547343 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=16.
Feb 9 20:31:35.547366 kernel: Rude variant of Tasks RCU enabled.
Feb 9 20:31:35.547371 kernel: Tracing variant of Tasks RCU enabled.
Feb 9 20:31:35.547376 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Feb 9 20:31:35.547382 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=16
Feb 9 20:31:35.547401 kernel: NR_IRQS: 33024, nr_irqs: 2184, preallocated irqs: 16
Feb 9 20:31:35.547406 kernel: random: crng init done
Feb 9 20:31:35.547410 kernel: Console: colour dummy device 80x25
Feb 9 20:31:35.547415 kernel: printk: console [tty0] enabled
Feb 9 20:31:35.547420 kernel: printk: console [ttyS1] enabled
Feb 9 20:31:35.547425 kernel: ACPI: Core revision 20210730
Feb 9 20:31:35.547430 kernel: hpet: HPET dysfunctional in PC10. Force disabled.
Feb 9 20:31:35.547434 kernel: APIC: Switch to symmetric I/O mode setup
Feb 9 20:31:35.547440 kernel: DMAR: Host address width 39
Feb 9 20:31:35.547445 kernel: DMAR: DRHD base: 0x000000fed91000 flags: 0x1
Feb 9 20:31:35.547450 kernel: DMAR: dmar0: reg_base_addr fed91000 ver 1:0 cap d2008c40660462 ecap f050da
Feb 9 20:31:35.547455 kernel: DMAR: RMRR base: 0x0000008cf18000 end: 0x0000008d161fff
Feb 9 20:31:35.547459 kernel: DMAR-IR: IOAPIC id 2 under DRHD base 0xfed91000 IOMMU 0
Feb 9 20:31:35.547464 kernel: DMAR-IR: HPET id 0 under DRHD base 0xfed91000
Feb 9 20:31:35.547469 kernel: DMAR-IR: Queued invalidation will be enabled to support x2apic and Intr-remapping.
Feb 9 20:31:35.547474 kernel: DMAR-IR: Enabled IRQ remapping in x2apic mode
Feb 9 20:31:35.547479 kernel: x2apic enabled
Feb 9 20:31:35.547484 kernel: Switched APIC routing to cluster x2apic.
Feb 9 20:31:35.547489 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x3101f59f5e6, max_idle_ns: 440795259996 ns
Feb 9 20:31:35.547494 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 6799.81 BogoMIPS (lpj=3399906)
Feb 9 20:31:35.547499 kernel: CPU0: Thermal monitoring enabled (TM1)
Feb 9 20:31:35.547504 kernel: process: using mwait in idle threads
Feb 9 20:31:35.547508 kernel: Last level iTLB entries: 4KB 64, 2MB 8, 4MB 8
Feb 9 20:31:35.547513 kernel: Last level dTLB entries: 4KB 64, 2MB 0, 4MB 0, 1GB 4
Feb 9 20:31:35.547518 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Feb 9 20:31:35.547522 kernel: Spectre V2 : WARNING: Unprivileged eBPF is enabled with eIBRS on, data leaks possible via Spectre v2 BHB attacks!
Feb 9 20:31:35.547528 kernel: Spectre V2 : Mitigation: Enhanced IBRS
Feb 9 20:31:35.547533 kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch
Feb 9 20:31:35.547538 kernel: Spectre V2 : Spectre v2 / PBRSB-eIBRS: Retire a single CALL on VMEXIT
Feb 9 20:31:35.547542 kernel: RETBleed: Mitigation: Enhanced IBRS
Feb 9 20:31:35.547547 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Feb 9 20:31:35.547552 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl and seccomp
Feb 9 20:31:35.547557 kernel: TAA: Mitigation: TSX disabled
Feb 9 20:31:35.547561 kernel: MMIO Stale Data: Mitigation: Clear CPU buffers
Feb 9 20:31:35.547566 kernel: SRBDS: Mitigation: Microcode
Feb 9 20:31:35.547571 kernel: GDS: Vulnerable: No microcode
Feb 9 20:31:35.547576 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Feb 9 20:31:35.547581 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Feb 9 20:31:35.547586 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Feb 9 20:31:35.547591 kernel: x86/fpu: Supporting XSAVE feature 0x008: 'MPX bounds registers'
Feb 9 20:31:35.547596 kernel: x86/fpu: Supporting XSAVE feature 0x010: 'MPX CSR'
Feb 9 20:31:35.547600 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Feb 9 20:31:35.547605 kernel: x86/fpu: xstate_offset[3]: 832, xstate_sizes[3]: 64
Feb 9 20:31:35.547610 kernel: x86/fpu: xstate_offset[4]: 896, xstate_sizes[4]: 64
Feb 9 20:31:35.547615 kernel: x86/fpu: Enabled xstate features 0x1f, context size is 960 bytes, using 'compacted' format.
Feb 9 20:31:35.547619 kernel: Freeing SMP alternatives memory: 32K
Feb 9 20:31:35.547624 kernel: pid_max: default: 32768 minimum: 301
Feb 9 20:31:35.547629 kernel: LSM: Security Framework initializing
Feb 9 20:31:35.547633 kernel: SELinux:  Initializing.
Feb 9 20:31:35.547639 kernel: Mount-cache hash table entries: 65536 (order: 7, 524288 bytes, linear)
Feb 9 20:31:35.547644 kernel: Mountpoint-cache hash table entries: 65536 (order: 7, 524288 bytes, linear)
Feb 9 20:31:35.547649 kernel: smpboot: Estimated ratio of average max frequency by base frequency (times 1024): 1445
Feb 9 20:31:35.547653 kernel: smpboot: CPU0: Intel(R) Xeon(R) E-2278G CPU @ 3.40GHz (family: 0x6, model: 0x9e, stepping: 0xd)
Feb 9 20:31:35.547658 kernel: Performance Events: PEBS fmt3+, Skylake events, 32-deep LBR, full-width counters, Intel PMU driver.
Feb 9 20:31:35.547663 kernel: ... version:                4
Feb 9 20:31:35.547668 kernel: ... bit width:              48
Feb 9 20:31:35.547673 kernel: ... generic registers:      4
Feb 9 20:31:35.547678 kernel: ... value mask:             0000ffffffffffff
Feb 9 20:31:35.547682 kernel: ... max period:             00007fffffffffff
Feb 9 20:31:35.547688 kernel: ... fixed-purpose events:   3
Feb 9 20:31:35.547693 kernel: ... event mask:             000000070000000f
Feb 9 20:31:35.547697 kernel: signal: max sigframe size: 2032
Feb 9 20:31:35.547702 kernel: rcu: Hierarchical SRCU implementation.
Feb 9 20:31:35.547707 kernel: NMI watchdog: Enabled. Permanently consumes one hw-PMU counter.
Feb 9 20:31:35.547712 kernel: smp: Bringing up secondary CPUs ...
Feb 9 20:31:35.547717 kernel: x86: Booting SMP configuration:
Feb 9 20:31:35.547722 kernel: .... node #0, CPUs: #1 #2 #3 #4 #5 #6 #7 #8
Feb 9 20:31:35.547727 kernel: MMIO Stale Data CPU bug present and SMT on, data leak possible. See https://www.kernel.org/doc/html/latest/admin-guide/hw-vuln/processor_mmio_stale_data.html for more details.
Feb 9 20:31:35.547732 kernel: #9 #10 #11 #12 #13 #14 #15
Feb 9 20:31:35.547737 kernel: smp: Brought up 1 node, 16 CPUs
Feb 9 20:31:35.547742 kernel: smpboot: Max logical packages: 1
Feb 9 20:31:35.547747 kernel: smpboot: Total of 16 processors activated (108796.99 BogoMIPS)
Feb 9 20:31:35.547752 kernel: devtmpfs: initialized
Feb 9 20:31:35.547756 kernel: x86/mm: Memory block size: 128MB
Feb 9 20:31:35.547761 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x81f80000-0x81f80fff] (4096 bytes)
Feb 9 20:31:35.547766 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x8c23b000-0x8c66cfff] (4399104 bytes)
Feb 9 20:31:35.547772 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Feb 9 20:31:35.547777 kernel: futex hash table entries: 4096 (order: 6, 262144 bytes, linear)
Feb 9 20:31:35.547782 kernel: pinctrl core: initialized pinctrl subsystem
Feb 9 20:31:35.547787 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Feb 9 20:31:35.547791 kernel: audit: initializing netlink subsys (disabled)
Feb 9 20:31:35.547796 kernel: audit: type=2000 audit(1707510690.040:1): state=initialized audit_enabled=0 res=1
Feb 9 20:31:35.547801 kernel: thermal_sys: Registered thermal governor 'step_wise'
Feb 9 20:31:35.547806 kernel: thermal_sys: Registered thermal governor 'user_space'
Feb 9 20:31:35.547811 kernel: cpuidle: using governor menu
Feb 9 20:31:35.547816 kernel: ACPI: bus type PCI registered
Feb 9 20:31:35.547821 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Feb 9 20:31:35.547826 kernel: dca service started, version 1.12.1
Feb 9 20:31:35.547831 kernel: PCI: MMCONFIG for domain 0000 [bus 00-ff] at [mem 0xe0000000-0xefffffff] (base 0xe0000000)
Feb 9 20:31:35.547836 kernel: PCI: MMCONFIG at [mem 0xe0000000-0xefffffff] reserved in E820
Feb 9 20:31:35.547841 kernel: PCI: Using configuration type 1 for base access
Feb 9 20:31:35.547845 kernel: ENERGY_PERF_BIAS: Set to 'normal', was 'performance'
Feb 9 20:31:35.547850 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Feb 9 20:31:35.547855 kernel: HugeTLB registered 1.00 GiB page size, pre-allocated 0 pages
Feb 9 20:31:35.547860 kernel: HugeTLB registered 2.00 MiB page size, pre-allocated 0 pages
Feb 9 20:31:35.547865 kernel: ACPI: Added _OSI(Module Device)
Feb 9 20:31:35.547870 kernel: ACPI: Added _OSI(Processor Device)
Feb 9 20:31:35.547875 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Feb 9 20:31:35.547880 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Feb 9 20:31:35.547885 kernel: ACPI: Added _OSI(Linux-Dell-Video)
Feb 9 20:31:35.547889 kernel: ACPI: Added _OSI(Linux-Lenovo-NV-HDMI-Audio)
Feb 9 20:31:35.547894 kernel: ACPI: Added _OSI(Linux-HPI-Hybrid-Graphics)
Feb 9 20:31:35.547899 kernel: ACPI: 12 ACPI AML tables successfully acquired and loaded
Feb 9 20:31:35.547905 kernel: ACPI: Dynamic OEM Table Load:
Feb 9 20:31:35.547909 kernel: ACPI: SSDT 0xFFFF9F1BC0212300 0000F4 (v02 PmRef Cpu0Psd 00003000 INTL 20160527)
Feb 9 20:31:35.547915 kernel: ACPI: \_SB_.PR00: _OSC native thermal LVT Acked
Feb 9 20:31:35.547919 kernel: ACPI: Dynamic OEM Table Load:
Feb 9 20:31:35.547924 kernel: ACPI: SSDT 0xFFFF9F1BC1AE2000 000400 (v02 PmRef Cpu0Cst 00003001 INTL 20160527)
Feb 9 20:31:35.547929 kernel: ACPI: Dynamic OEM Table Load:
Feb 9 20:31:35.547934 kernel: ACPI: SSDT 0xFFFF9F1BC1A5D000 000683 (v02 PmRef Cpu0Ist 00003000 INTL 20160527)
Feb 9 20:31:35.547938 kernel: ACPI: Dynamic OEM Table Load:
Feb 9 20:31:35.547943 kernel: ACPI: SSDT 0xFFFF9F1BC1A5A000 0005FC (v02 PmRef ApIst 00003000 INTL 20160527)
Feb 9 20:31:35.547948 kernel: ACPI: Dynamic OEM Table Load:
Feb 9 20:31:35.547953 kernel: ACPI: SSDT 0xFFFF9F1BC0149000 000AB0 (v02 PmRef ApPsd 00003000 INTL 20160527)
Feb 9 20:31:35.547958 kernel: ACPI: Dynamic OEM Table Load:
Feb 9 20:31:35.547963 kernel: ACPI: SSDT 0xFFFF9F1BC1AE0C00 00030A (v02 PmRef ApCst 00003000 INTL 20160527)
Feb 9 20:31:35.547968 kernel: ACPI: Interpreter enabled
Feb 9 20:31:35.547973 kernel: ACPI: PM: (supports S0 S5)
Feb 9 20:31:35.547977 kernel: ACPI: Using IOAPIC for interrupt routing
Feb 9 20:31:35.547982 kernel: HEST: Enabling Firmware First mode for corrected errors.
Feb 9 20:31:35.547987 kernel: mce: [Firmware Bug]: Ignoring request to disable invalid MCA bank 14.
Feb 9 20:31:35.547992 kernel: HEST: Table parsing has been initialized.
Feb 9 20:31:35.547997 kernel: GHES: APEI firmware first mode is enabled by APEI bit and WHEA _OSC.
Feb 9 20:31:35.548002 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Feb 9 20:31:35.548007 kernel: ACPI: Enabled 9 GPEs in block 00 to 7F
Feb 9 20:31:35.548012 kernel: ACPI: PM: Power Resource [USBC]
Feb 9 20:31:35.548017 kernel: ACPI: PM: Power Resource [V0PR]
Feb 9 20:31:35.548022 kernel: ACPI: PM: Power Resource [V1PR]
Feb 9 20:31:35.548026 kernel: ACPI: PM: Power Resource [V2PR]
Feb 9 20:31:35.548031 kernel: ACPI: PM: Power Resource [WRST]
Feb 9 20:31:35.548036 kernel: ACPI: PM: Power Resource [FN00]
Feb 9 20:31:35.548041 kernel: ACPI: PM: Power Resource [FN01]
Feb 9 20:31:35.548046 kernel: ACPI: PM: Power Resource [FN02]
Feb 9 20:31:35.548051 kernel: ACPI: PM: Power Resource [FN03]
Feb 9 20:31:35.548056 kernel: ACPI: PM: Power Resource [FN04]
Feb 9 20:31:35.548060 kernel: ACPI: PM: Power Resource [PIN]
Feb 9 20:31:35.548065 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-fe])
Feb 9 20:31:35.548128 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Feb 9 20:31:35.548173 kernel: acpi PNP0A08:00: _OSC: platform does not support [AER]
Feb 9 20:31:35.548214 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME PCIeCapability LTR]
Feb 9 20:31:35.548221 kernel: PCI host bridge to bus 0000:00
Feb 9 20:31:35.548264 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Feb 9 20:31:35.548300 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Feb 9 20:31:35.548336 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Feb 9 20:31:35.548405 kernel: pci_bus 0000:00: root bus resource [mem 0x90000000-0xdfffffff window]
Feb 9 20:31:35.548441 kernel: pci_bus 0000:00: root bus resource [mem 0xfc800000-0xfe7fffff window]
Feb 9 20:31:35.548477 kernel: pci_bus 0000:00: root bus resource [bus 00-fe]
Feb 9 20:31:35.548527 kernel: pci 0000:00:00.0: [8086:3e31] type 00 class 0x060000
Feb 9 20:31:35.548575 kernel: pci 0000:00:01.0: [8086:1901] type 01 class 0x060400
Feb 9 20:31:35.548617 kernel: pci 0000:00:01.0: PME# supported from D0 D3hot D3cold
Feb 9 20:31:35.548663 kernel: pci 0000:00:08.0: [8086:1911] type 00 class 0x088000
Feb 9 20:31:35.548704 kernel: pci 0000:00:08.0: reg 0x10: [mem 0x9551f000-0x9551ffff 64bit]
Feb 9 20:31:35.548750 kernel: pci 0000:00:12.0: [8086:a379] type 00 class 0x118000
Feb 9 20:31:35.548792 kernel: pci 0000:00:12.0: reg 0x10: [mem 0x9551e000-0x9551efff 64bit]
Feb 9 20:31:35.548839 kernel: pci 0000:00:14.0: [8086:a36d] type 00 class 0x0c0330
Feb 9 20:31:35.548880 kernel: pci 0000:00:14.0: reg 0x10: [mem 0x95500000-0x9550ffff 64bit]
Feb 9 20:31:35.548923 kernel: pci 0000:00:14.0: PME# supported from D3hot D3cold
Feb 9 20:31:35.548967 kernel: pci 0000:00:14.2: [8086:a36f] type 00 class 0x050000
Feb 9 20:31:35.549009 kernel: pci 0000:00:14.2: reg 0x10: [mem 0x95512000-0x95513fff 64bit]
Feb 9 20:31:35.549050 kernel: pci 0000:00:14.2: reg 0x18: [mem 0x9551d000-0x9551dfff 64bit]
Feb 9 20:31:35.549094 kernel: pci 0000:00:15.0: [8086:a368] type 00 class 0x0c8000
Feb 9 20:31:35.549135 kernel: pci 0000:00:15.0: reg 0x10: [mem 0x00000000-0x00000fff 64bit]
Feb 9 20:31:35.549181 kernel: pci 0000:00:15.1: [8086:a369] type 00 class 0x0c8000
Feb 9 20:31:35.549222 kernel: pci 0000:00:15.1: reg 0x10: [mem 0x00000000-0x00000fff 64bit]
Feb 9 20:31:35.549266 kernel: pci 0000:00:16.0: [8086:a360] type 00 class 0x078000
Feb 9 20:31:35.549308 kernel: pci 0000:00:16.0: reg 0x10: [mem 0x9551a000-0x9551afff 64bit]
Feb 9 20:31:35.549367 kernel: pci 0000:00:16.0: PME# supported from D3hot
Feb 9 20:31:35.549424 kernel: pci 0000:00:16.1: [8086:a361] type 00 class 0x078000
Feb 9 20:31:35.549465 kernel: pci 0000:00:16.1: reg 0x10: [mem 0x95519000-0x95519fff 64bit]
Feb 9 20:31:35.549505 kernel: pci 0000:00:16.1: PME# supported from D3hot
Feb 9 20:31:35.549549 kernel: pci 0000:00:16.4: [8086:a364] type 00 class 0x078000
Feb 9 20:31:35.549591 kernel: pci 0000:00:16.4: reg 0x10: [mem 0x95518000-0x95518fff 64bit]
Feb 9 20:31:35.549631 kernel: pci 0000:00:16.4: PME# supported from D3hot
Feb 9 20:31:35.549674 kernel: pci 0000:00:17.0: [8086:a352] type 00 class 0x010601
Feb 9 20:31:35.549714 kernel: pci 0000:00:17.0: reg 0x10: [mem 0x95510000-0x95511fff]
Feb 9 20:31:35.549754 kernel: pci 0000:00:17.0: reg 0x14: [mem 0x95517000-0x955170ff]
Feb 9 20:31:35.549792 kernel: pci 0000:00:17.0: reg 0x18: [io 0x6050-0x6057]
Feb 9 20:31:35.549833 kernel: pci 0000:00:17.0: reg 0x1c: [io 0x6040-0x6043]
Feb 9 20:31:35.549879 kernel: pci 0000:00:17.0: reg 0x20: [io 0x6020-0x603f]
Feb 9 20:31:35.549922 kernel: pci 0000:00:17.0: reg 0x24: [mem 0x95516000-0x955167ff]
Feb 9 20:31:35.549962 kernel: pci 0000:00:17.0: PME# supported from D3hot
Feb 9 20:31:35.550005 kernel: pci 0000:00:1b.0: [8086:a340] type 01 class 0x060400
Feb 9 20:31:35.550046 kernel: pci 0000:00:1b.0: PME# supported from D0 D3hot D3cold
Feb 9 20:31:35.550090 kernel: pci 0000:00:1b.4: [8086:a32c] type 01 class 0x060400
Feb 9 20:31:35.550132 kernel: pci 0000:00:1b.4: PME# supported from D0 D3hot D3cold
Feb 9 20:31:35.550178 kernel: pci 0000:00:1b.5: [8086:a32d] type 01 class 0x060400
Feb 9 20:31:35.550220 kernel: pci 0000:00:1b.5: PME# supported from D0 D3hot D3cold
Feb 9 20:31:35.550265 kernel: pci 0000:00:1c.0: [8086:a338] type 01 class 0x060400
Feb 9 20:31:35.550307 kernel: pci 0000:00:1c.0: PME# supported from D0 D3hot D3cold
Feb 9 20:31:35.550370 kernel: pci 0000:00:1c.3: [8086:a33b] type 01 class 0x060400
Feb 9 20:31:35.550427 kernel: pci 0000:00:1c.3: PME# supported from D0 D3hot D3cold
Feb 9 20:31:35.550471 kernel: pci 0000:00:1e.0: [8086:a328] type 00 class 0x078000
Feb 9 20:31:35.550512 kernel: pci 0000:00:1e.0: reg 0x10: [mem 0x00000000-0x00000fff 64bit]
Feb 9 20:31:35.550558 kernel: pci 0000:00:1f.0: [8086:a309] type 00 class 0x060100
Feb 9 20:31:35.550603 kernel: pci 0000:00:1f.4: [8086:a323] type 00 class 0x0c0500
Feb 9 20:31:35.550644 kernel: pci 0000:00:1f.4: reg 0x10: [mem 0x95514000-0x955140ff 64bit]
Feb 9 20:31:35.550684 kernel: pci 0000:00:1f.4: reg 0x20: [io 0xefa0-0xefbf]
Feb 9 20:31:35.550729 kernel: pci 0000:00:1f.5: [8086:a324] type 00 class 0x0c8000
Feb 9 20:31:35.550771 kernel: pci 0000:00:1f.5: reg 0x10: [mem 0xfe010000-0xfe010fff]
Feb 9 20:31:35.550816 kernel: pci 0000:01:00.0: [15b3:1015] type 00 class 0x020000
Feb 9 20:31:35.550859 kernel: pci 0000:01:00.0: reg 0x10: [mem 0x92000000-0x93ffffff 64bit pref]
Feb 9 20:31:35.550903 kernel: pci 0000:01:00.0: reg 0x30: [mem 0x95200000-0x952fffff pref]
Feb 9 20:31:35.550946 kernel: pci 0000:01:00.0: PME# supported from D3cold
Feb 9 20:31:35.550988 kernel: pci 0000:01:00.0: reg 0x1a4: [mem 0x00000000-0x000fffff 64bit pref]
Feb 9 20:31:35.551030 kernel: pci 0000:01:00.0: VF(n) BAR0 space: [mem 0x00000000-0x007fffff 64bit pref] (contains BAR0 for 8 VFs)
Feb 9 20:31:35.551076 kernel: pci 0000:01:00.1: [15b3:1015] type 00 class 0x020000
Feb 9 20:31:35.551120 kernel: pci 0000:01:00.1: reg 0x10: [mem 0x90000000-0x91ffffff 64bit pref]
Feb 9 20:31:35.551164 kernel: pci 0000:01:00.1: reg 0x30: [mem 0x95100000-0x951fffff pref]
Feb 9 20:31:35.551206 kernel: pci 0000:01:00.1: PME# supported from D3cold
Feb 9 20:31:35.551248 kernel: pci 0000:01:00.1: reg 0x1a4: [mem 0x00000000-0x000fffff 64bit pref]
Feb 9 20:31:35.551289 kernel: pci 0000:01:00.1: VF(n) BAR0 space: [mem 0x00000000-0x007fffff 64bit pref] (contains BAR0 for 8 VFs)
Feb 9 20:31:35.551331 kernel: pci 0000:00:01.0: PCI
bridge to [bus 01] Feb 9 20:31:35.551373 kernel: pci 0000:00:01.0: bridge window [mem 0x95100000-0x952fffff] Feb 9 20:31:35.551416 kernel: pci 0000:00:01.0: bridge window [mem 0x90000000-0x93ffffff 64bit pref] Feb 9 20:31:35.551456 kernel: pci 0000:00:1b.0: PCI bridge to [bus 02] Feb 9 20:31:35.551505 kernel: pci 0000:03:00.0: [8086:1533] type 00 class 0x020000 Feb 9 20:31:35.551547 kernel: pci 0000:03:00.0: reg 0x10: [mem 0x95400000-0x9547ffff] Feb 9 20:31:35.551590 kernel: pci 0000:03:00.0: reg 0x18: [io 0x5000-0x501f] Feb 9 20:31:35.551631 kernel: pci 0000:03:00.0: reg 0x1c: [mem 0x95480000-0x95483fff] Feb 9 20:31:35.551673 kernel: pci 0000:03:00.0: PME# supported from D0 D3hot D3cold Feb 9 20:31:35.551713 kernel: pci 0000:00:1b.4: PCI bridge to [bus 03] Feb 9 20:31:35.551753 kernel: pci 0000:00:1b.4: bridge window [io 0x5000-0x5fff] Feb 9 20:31:35.551794 kernel: pci 0000:00:1b.4: bridge window [mem 0x95400000-0x954fffff] Feb 9 20:31:35.551841 kernel: pci 0000:04:00.0: [8086:1533] type 00 class 0x020000 Feb 9 20:31:35.551884 kernel: pci 0000:04:00.0: reg 0x10: [mem 0x95300000-0x9537ffff] Feb 9 20:31:35.551926 kernel: pci 0000:04:00.0: reg 0x18: [io 0x4000-0x401f] Feb 9 20:31:35.551968 kernel: pci 0000:04:00.0: reg 0x1c: [mem 0x95380000-0x95383fff] Feb 9 20:31:35.552041 kernel: pci 0000:04:00.0: PME# supported from D0 D3hot D3cold Feb 9 20:31:35.552104 kernel: pci 0000:00:1b.5: PCI bridge to [bus 04] Feb 9 20:31:35.552144 kernel: pci 0000:00:1b.5: bridge window [io 0x4000-0x4fff] Feb 9 20:31:35.552188 kernel: pci 0000:00:1b.5: bridge window [mem 0x95300000-0x953fffff] Feb 9 20:31:35.552227 kernel: pci 0000:00:1c.0: PCI bridge to [bus 05] Feb 9 20:31:35.552275 kernel: pci 0000:06:00.0: [1a03:1150] type 01 class 0x060400 Feb 9 20:31:35.552319 kernel: pci 0000:06:00.0: enabling Extended Tags Feb 9 20:31:35.552398 kernel: pci 0000:06:00.0: supports D1 D2 Feb 9 20:31:35.552442 kernel: pci 0000:06:00.0: PME# supported from D0 D1 D2 D3hot D3cold Feb 9 20:31:35.552482 
kernel: pci 0000:00:1c.3: PCI bridge to [bus 06-07] Feb 9 20:31:35.552524 kernel: pci 0000:00:1c.3: bridge window [io 0x3000-0x3fff] Feb 9 20:31:35.552566 kernel: pci 0000:00:1c.3: bridge window [mem 0x94000000-0x950fffff] Feb 9 20:31:35.552614 kernel: pci_bus 0000:07: extended config space not accessible Feb 9 20:31:35.552661 kernel: pci 0000:07:00.0: [1a03:2000] type 00 class 0x030000 Feb 9 20:31:35.552706 kernel: pci 0000:07:00.0: reg 0x10: [mem 0x94000000-0x94ffffff] Feb 9 20:31:35.552751 kernel: pci 0000:07:00.0: reg 0x14: [mem 0x95000000-0x9501ffff] Feb 9 20:31:35.552796 kernel: pci 0000:07:00.0: reg 0x18: [io 0x3000-0x307f] Feb 9 20:31:35.552839 kernel: pci 0000:07:00.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] Feb 9 20:31:35.552885 kernel: pci 0000:07:00.0: supports D1 D2 Feb 9 20:31:35.552929 kernel: pci 0000:07:00.0: PME# supported from D0 D1 D2 D3hot D3cold Feb 9 20:31:35.552973 kernel: pci 0000:06:00.0: PCI bridge to [bus 07] Feb 9 20:31:35.553015 kernel: pci 0000:06:00.0: bridge window [io 0x3000-0x3fff] Feb 9 20:31:35.553057 kernel: pci 0000:06:00.0: bridge window [mem 0x94000000-0x950fffff] Feb 9 20:31:35.553065 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 0 Feb 9 20:31:35.553070 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 1 Feb 9 20:31:35.553076 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 0 Feb 9 20:31:35.553082 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 0 Feb 9 20:31:35.553087 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 0 Feb 9 20:31:35.553092 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 0 Feb 9 20:31:35.553097 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 0 Feb 9 20:31:35.553103 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 0 Feb 9 20:31:35.553108 kernel: iommu: Default domain type: Translated Feb 9 20:31:35.553113 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Feb 9 20:31:35.553156 kernel: pci 0000:07:00.0: 
vgaarb: setting as boot VGA device Feb 9 20:31:35.553202 kernel: pci 0000:07:00.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Feb 9 20:31:35.553245 kernel: pci 0000:07:00.0: vgaarb: bridge control possible Feb 9 20:31:35.553253 kernel: vgaarb: loaded Feb 9 20:31:35.553258 kernel: pps_core: LinuxPPS API ver. 1 registered Feb 9 20:31:35.553263 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti Feb 9 20:31:35.553269 kernel: PTP clock support registered Feb 9 20:31:35.553274 kernel: PCI: Using ACPI for IRQ routing Feb 9 20:31:35.553279 kernel: PCI: pci_cache_line_size set to 64 bytes Feb 9 20:31:35.553284 kernel: e820: reserve RAM buffer [mem 0x00099800-0x0009ffff] Feb 9 20:31:35.553290 kernel: e820: reserve RAM buffer [mem 0x81f80000-0x83ffffff] Feb 9 20:31:35.553295 kernel: e820: reserve RAM buffer [mem 0x8afcd000-0x8bffffff] Feb 9 20:31:35.553300 kernel: e820: reserve RAM buffer [mem 0x8c23b000-0x8fffffff] Feb 9 20:31:35.553305 kernel: e820: reserve RAM buffer [mem 0x8ef00000-0x8fffffff] Feb 9 20:31:35.553310 kernel: e820: reserve RAM buffer [mem 0x86f000000-0x86fffffff] Feb 9 20:31:35.553315 kernel: clocksource: Switched to clocksource tsc-early Feb 9 20:31:35.553321 kernel: VFS: Disk quotas dquot_6.6.0 Feb 9 20:31:35.553326 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Feb 9 20:31:35.553331 kernel: pnp: PnP ACPI init Feb 9 20:31:35.553408 kernel: system 00:00: [mem 0x40000000-0x403fffff] has been reserved Feb 9 20:31:35.553449 kernel: pnp 00:02: [dma 0 disabled] Feb 9 20:31:35.553490 kernel: pnp 00:03: [dma 0 disabled] Feb 9 20:31:35.553532 kernel: system 00:04: [io 0x0680-0x069f] has been reserved Feb 9 20:31:35.553569 kernel: system 00:04: [io 0x164e-0x164f] has been reserved Feb 9 20:31:35.553609 kernel: system 00:05: [io 0x1854-0x1857] has been reserved Feb 9 20:31:35.553651 kernel: system 00:06: [mem 0xfed10000-0xfed17fff] has been reserved Feb 9 20:31:35.553688 kernel: system 00:06: [mem 
0xfed18000-0xfed18fff] has been reserved Feb 9 20:31:35.553724 kernel: system 00:06: [mem 0xfed19000-0xfed19fff] has been reserved Feb 9 20:31:35.553760 kernel: system 00:06: [mem 0xe0000000-0xefffffff] has been reserved Feb 9 20:31:35.553797 kernel: system 00:06: [mem 0xfed20000-0xfed3ffff] has been reserved Feb 9 20:31:35.553832 kernel: system 00:06: [mem 0xfed90000-0xfed93fff] could not be reserved Feb 9 20:31:35.553869 kernel: system 00:06: [mem 0xfed45000-0xfed8ffff] has been reserved Feb 9 20:31:35.553907 kernel: system 00:06: [mem 0xfee00000-0xfeefffff] could not be reserved Feb 9 20:31:35.553947 kernel: system 00:07: [io 0x1800-0x18fe] could not be reserved Feb 9 20:31:35.553983 kernel: system 00:07: [mem 0xfd000000-0xfd69ffff] has been reserved Feb 9 20:31:35.554020 kernel: system 00:07: [mem 0xfd6c0000-0xfd6cffff] has been reserved Feb 9 20:31:35.554056 kernel: system 00:07: [mem 0xfd6f0000-0xfdffffff] has been reserved Feb 9 20:31:35.554093 kernel: system 00:07: [mem 0xfe000000-0xfe01ffff] could not be reserved Feb 9 20:31:35.554130 kernel: system 00:07: [mem 0xfe200000-0xfe7fffff] has been reserved Feb 9 20:31:35.554167 kernel: system 00:07: [mem 0xff000000-0xffffffff] has been reserved Feb 9 20:31:35.554208 kernel: system 00:08: [io 0x2000-0x20fe] has been reserved Feb 9 20:31:35.554216 kernel: pnp: PnP ACPI: found 10 devices Feb 9 20:31:35.554222 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Feb 9 20:31:35.554227 kernel: NET: Registered PF_INET protocol family Feb 9 20:31:35.554232 kernel: IP idents hash table entries: 262144 (order: 9, 2097152 bytes, linear) Feb 9 20:31:35.554237 kernel: tcp_listen_portaddr_hash hash table entries: 16384 (order: 6, 262144 bytes, linear) Feb 9 20:31:35.554243 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Feb 9 20:31:35.554249 kernel: TCP established hash table entries: 262144 (order: 9, 2097152 bytes, linear) Feb 9 20:31:35.554255 
kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear) Feb 9 20:31:35.554260 kernel: TCP: Hash tables configured (established 262144 bind 65536) Feb 9 20:31:35.554265 kernel: UDP hash table entries: 16384 (order: 7, 524288 bytes, linear) Feb 9 20:31:35.554270 kernel: UDP-Lite hash table entries: 16384 (order: 7, 524288 bytes, linear) Feb 9 20:31:35.554276 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Feb 9 20:31:35.554281 kernel: NET: Registered PF_XDP protocol family Feb 9 20:31:35.554322 kernel: pci 0000:00:15.0: BAR 0: assigned [mem 0x95515000-0x95515fff 64bit] Feb 9 20:31:35.554406 kernel: pci 0000:00:15.1: BAR 0: assigned [mem 0x9551b000-0x9551bfff 64bit] Feb 9 20:31:35.554448 kernel: pci 0000:00:1e.0: BAR 0: assigned [mem 0x9551c000-0x9551cfff 64bit] Feb 9 20:31:35.554490 kernel: pci 0000:01:00.0: BAR 7: no space for [mem size 0x00800000 64bit pref] Feb 9 20:31:35.554533 kernel: pci 0000:01:00.0: BAR 7: failed to assign [mem size 0x00800000 64bit pref] Feb 9 20:31:35.554575 kernel: pci 0000:01:00.1: BAR 7: no space for [mem size 0x00800000 64bit pref] Feb 9 20:31:35.554618 kernel: pci 0000:01:00.1: BAR 7: failed to assign [mem size 0x00800000 64bit pref] Feb 9 20:31:35.554659 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Feb 9 20:31:35.554699 kernel: pci 0000:00:01.0: bridge window [mem 0x95100000-0x952fffff] Feb 9 20:31:35.554743 kernel: pci 0000:00:01.0: bridge window [mem 0x90000000-0x93ffffff 64bit pref] Feb 9 20:31:35.554783 kernel: pci 0000:00:1b.0: PCI bridge to [bus 02] Feb 9 20:31:35.554824 kernel: pci 0000:00:1b.4: PCI bridge to [bus 03] Feb 9 20:31:35.554865 kernel: pci 0000:00:1b.4: bridge window [io 0x5000-0x5fff] Feb 9 20:31:35.554906 kernel: pci 0000:00:1b.4: bridge window [mem 0x95400000-0x954fffff] Feb 9 20:31:35.554948 kernel: pci 0000:00:1b.5: PCI bridge to [bus 04] Feb 9 20:31:35.554989 kernel: pci 0000:00:1b.5: bridge window [io 0x4000-0x4fff] Feb 9 20:31:35.555030 kernel: pci 0000:00:1b.5: bridge 
window [mem 0x95300000-0x953fffff] Feb 9 20:31:35.555071 kernel: pci 0000:00:1c.0: PCI bridge to [bus 05] Feb 9 20:31:35.555114 kernel: pci 0000:06:00.0: PCI bridge to [bus 07] Feb 9 20:31:35.555156 kernel: pci 0000:06:00.0: bridge window [io 0x3000-0x3fff] Feb 9 20:31:35.555198 kernel: pci 0000:06:00.0: bridge window [mem 0x94000000-0x950fffff] Feb 9 20:31:35.555239 kernel: pci 0000:00:1c.3: PCI bridge to [bus 06-07] Feb 9 20:31:35.555281 kernel: pci 0000:00:1c.3: bridge window [io 0x3000-0x3fff] Feb 9 20:31:35.555323 kernel: pci 0000:00:1c.3: bridge window [mem 0x94000000-0x950fffff] Feb 9 20:31:35.555401 kernel: pci_bus 0000:00: Some PCI device resources are unassigned, try booting with pci=realloc Feb 9 20:31:35.555437 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window] Feb 9 20:31:35.555472 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window] Feb 9 20:31:35.555508 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window] Feb 9 20:31:35.555543 kernel: pci_bus 0000:00: resource 7 [mem 0x90000000-0xdfffffff window] Feb 9 20:31:35.555579 kernel: pci_bus 0000:00: resource 8 [mem 0xfc800000-0xfe7fffff window] Feb 9 20:31:35.555619 kernel: pci_bus 0000:01: resource 1 [mem 0x95100000-0x952fffff] Feb 9 20:31:35.555660 kernel: pci_bus 0000:01: resource 2 [mem 0x90000000-0x93ffffff 64bit pref] Feb 9 20:31:35.555703 kernel: pci_bus 0000:03: resource 0 [io 0x5000-0x5fff] Feb 9 20:31:35.555741 kernel: pci_bus 0000:03: resource 1 [mem 0x95400000-0x954fffff] Feb 9 20:31:35.555782 kernel: pci_bus 0000:04: resource 0 [io 0x4000-0x4fff] Feb 9 20:31:35.555819 kernel: pci_bus 0000:04: resource 1 [mem 0x95300000-0x953fffff] Feb 9 20:31:35.555861 kernel: pci_bus 0000:06: resource 0 [io 0x3000-0x3fff] Feb 9 20:31:35.555900 kernel: pci_bus 0000:06: resource 1 [mem 0x94000000-0x950fffff] Feb 9 20:31:35.555941 kernel: pci_bus 0000:07: resource 0 [io 0x3000-0x3fff] Feb 9 20:31:35.555980 kernel: pci_bus 0000:07: resource 1 [mem 0x94000000-0x950fffff] Feb 9 
20:31:35.555988 kernel: PCI: CLS 64 bytes, default 64 Feb 9 20:31:35.555993 kernel: DMAR: No ATSR found Feb 9 20:31:35.555999 kernel: DMAR: No SATC found Feb 9 20:31:35.556005 kernel: DMAR: dmar0: Using Queued invalidation Feb 9 20:31:35.556045 kernel: pci 0000:00:00.0: Adding to iommu group 0 Feb 9 20:31:35.556089 kernel: pci 0000:00:01.0: Adding to iommu group 1 Feb 9 20:31:35.556131 kernel: pci 0000:00:08.0: Adding to iommu group 2 Feb 9 20:31:35.556172 kernel: pci 0000:00:12.0: Adding to iommu group 3 Feb 9 20:31:35.556212 kernel: pci 0000:00:14.0: Adding to iommu group 4 Feb 9 20:31:35.556252 kernel: pci 0000:00:14.2: Adding to iommu group 4 Feb 9 20:31:35.556292 kernel: pci 0000:00:15.0: Adding to iommu group 5 Feb 9 20:31:35.556332 kernel: pci 0000:00:15.1: Adding to iommu group 5 Feb 9 20:31:35.556413 kernel: pci 0000:00:16.0: Adding to iommu group 6 Feb 9 20:31:35.556455 kernel: pci 0000:00:16.1: Adding to iommu group 6 Feb 9 20:31:35.556496 kernel: pci 0000:00:16.4: Adding to iommu group 6 Feb 9 20:31:35.556536 kernel: pci 0000:00:17.0: Adding to iommu group 7 Feb 9 20:31:35.556577 kernel: pci 0000:00:1b.0: Adding to iommu group 8 Feb 9 20:31:35.556617 kernel: pci 0000:00:1b.4: Adding to iommu group 9 Feb 9 20:31:35.556659 kernel: pci 0000:00:1b.5: Adding to iommu group 10 Feb 9 20:31:35.556700 kernel: pci 0000:00:1c.0: Adding to iommu group 11 Feb 9 20:31:35.556740 kernel: pci 0000:00:1c.3: Adding to iommu group 12 Feb 9 20:31:35.556782 kernel: pci 0000:00:1e.0: Adding to iommu group 13 Feb 9 20:31:35.556823 kernel: pci 0000:00:1f.0: Adding to iommu group 14 Feb 9 20:31:35.556864 kernel: pci 0000:00:1f.4: Adding to iommu group 14 Feb 9 20:31:35.556905 kernel: pci 0000:00:1f.5: Adding to iommu group 14 Feb 9 20:31:35.556947 kernel: pci 0000:01:00.0: Adding to iommu group 1 Feb 9 20:31:35.556990 kernel: pci 0000:01:00.1: Adding to iommu group 1 Feb 9 20:31:35.557032 kernel: pci 0000:03:00.0: Adding to iommu group 15 Feb 9 20:31:35.557074 kernel: pci 
0000:04:00.0: Adding to iommu group 16 Feb 9 20:31:35.557120 kernel: pci 0000:06:00.0: Adding to iommu group 17 Feb 9 20:31:35.557164 kernel: pci 0000:07:00.0: Adding to iommu group 17 Feb 9 20:31:35.557171 kernel: DMAR: Intel(R) Virtualization Technology for Directed I/O Feb 9 20:31:35.557177 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB) Feb 9 20:31:35.557182 kernel: software IO TLB: mapped [mem 0x0000000086fcd000-0x000000008afcd000] (64MB) Feb 9 20:31:35.557187 kernel: RAPL PMU: API unit is 2^-32 Joules, 3 fixed counters, 655360 ms ovfl timer Feb 9 20:31:35.557193 kernel: RAPL PMU: hw unit of domain pp0-core 2^-14 Joules Feb 9 20:31:35.557198 kernel: RAPL PMU: hw unit of domain package 2^-14 Joules Feb 9 20:31:35.557204 kernel: RAPL PMU: hw unit of domain dram 2^-14 Joules Feb 9 20:31:35.557247 kernel: platform rtc_cmos: registered platform RTC device (no PNP device found) Feb 9 20:31:35.557255 kernel: Initialise system trusted keyrings Feb 9 20:31:35.557260 kernel: workingset: timestamp_bits=39 max_order=23 bucket_order=0 Feb 9 20:31:35.557265 kernel: Key type asymmetric registered Feb 9 20:31:35.557270 kernel: Asymmetric key parser 'x509' registered Feb 9 20:31:35.557276 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249) Feb 9 20:31:35.557281 kernel: io scheduler mq-deadline registered Feb 9 20:31:35.557287 kernel: io scheduler kyber registered Feb 9 20:31:35.557292 kernel: io scheduler bfq registered Feb 9 20:31:35.557334 kernel: pcieport 0000:00:01.0: PME: Signaling with IRQ 121 Feb 9 20:31:35.557378 kernel: pcieport 0000:00:1b.0: PME: Signaling with IRQ 122 Feb 9 20:31:35.557419 kernel: pcieport 0000:00:1b.4: PME: Signaling with IRQ 123 Feb 9 20:31:35.557460 kernel: pcieport 0000:00:1b.5: PME: Signaling with IRQ 124 Feb 9 20:31:35.557501 kernel: pcieport 0000:00:1c.0: PME: Signaling with IRQ 125 Feb 9 20:31:35.557542 kernel: pcieport 0000:00:1c.3: PME: Signaling with IRQ 126 Feb 9 20:31:35.557589 kernel: thermal 
LNXTHERM:00: registered as thermal_zone0 Feb 9 20:31:35.557597 kernel: ACPI: thermal: Thermal Zone [TZ00] (28 C) Feb 9 20:31:35.557603 kernel: ERST: Error Record Serialization Table (ERST) support is initialized. Feb 9 20:31:35.557608 kernel: pstore: Registered erst as persistent store backend Feb 9 20:31:35.557613 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Feb 9 20:31:35.557619 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Feb 9 20:31:35.557624 kernel: 00:02: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Feb 9 20:31:35.557629 kernel: 00:03: ttyS1 at I/O 0x2f8 (irq = 3, base_baud = 115200) is a 16550A Feb 9 20:31:35.557635 kernel: hpet_acpi_add: no address or irqs in _CRS Feb 9 20:31:35.557679 kernel: tpm_tis MSFT0101:00: 2.0 TPM (device-id 0x1B, rev-id 16) Feb 9 20:31:35.557687 kernel: i8042: PNP: No PS/2 controller found. Feb 9 20:31:35.557723 kernel: rtc_cmos rtc_cmos: RTC can wake from S4 Feb 9 20:31:35.557762 kernel: rtc_cmos rtc_cmos: registered as rtc0 Feb 9 20:31:35.557799 kernel: rtc_cmos rtc_cmos: setting system clock to 2024-02-09T20:31:34 UTC (1707510694) Feb 9 20:31:35.557836 kernel: rtc_cmos rtc_cmos: alarms up to one month, y3k, 114 bytes nvram Feb 9 20:31:35.557843 kernel: fail to initialize ptp_kvm Feb 9 20:31:35.557850 kernel: intel_pstate: Intel P-state driver initializing Feb 9 20:31:35.557855 kernel: intel_pstate: Disabling energy efficiency optimization Feb 9 20:31:35.557860 kernel: intel_pstate: HWP enabled Feb 9 20:31:35.557865 kernel: vesafb: mode is 1024x768x8, linelength=1024, pages=0 Feb 9 20:31:35.557871 kernel: vesafb: scrolling: redraw Feb 9 20:31:35.557876 kernel: vesafb: Pseudocolor: size=0:8:8:8, shift=0:0:0:0 Feb 9 20:31:35.557881 kernel: vesafb: framebuffer at 0x94000000, mapped to 0x00000000cb63dd5a, using 768k, total 768k Feb 9 20:31:35.557886 kernel: Console: switching to colour frame buffer device 128x48 Feb 9 20:31:35.557891 kernel: fb0: VESA VGA frame buffer device Feb 9 
20:31:35.557898 kernel: NET: Registered PF_INET6 protocol family Feb 9 20:31:35.557903 kernel: Segment Routing with IPv6 Feb 9 20:31:35.557908 kernel: In-situ OAM (IOAM) with IPv6 Feb 9 20:31:35.557913 kernel: NET: Registered PF_PACKET protocol family Feb 9 20:31:35.557918 kernel: Key type dns_resolver registered Feb 9 20:31:35.557923 kernel: microcode: sig=0x906ed, pf=0x2, revision=0xf4 Feb 9 20:31:35.557929 kernel: microcode: Microcode Update Driver: v2.2. Feb 9 20:31:35.557934 kernel: IPI shorthand broadcast: enabled Feb 9 20:31:35.557939 kernel: sched_clock: Marking stable (1734578523, 1339175905)->(4493104702, -1419350274) Feb 9 20:31:35.557945 kernel: registered taskstats version 1 Feb 9 20:31:35.557951 kernel: Loading compiled-in X.509 certificates Feb 9 20:31:35.557956 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 5.15.148-flatcar: 56154408a02b3bd349a9e9180c9bd837fd1d636a' Feb 9 20:31:35.557961 kernel: Key type .fscrypt registered Feb 9 20:31:35.557966 kernel: Key type fscrypt-provisioning registered Feb 9 20:31:35.557971 kernel: pstore: Using crash dump compression: deflate Feb 9 20:31:35.557976 kernel: ima: Allocated hash algorithm: sha1 Feb 9 20:31:35.557981 kernel: ima: No architecture policies found Feb 9 20:31:35.557987 kernel: Freeing unused kernel image (initmem) memory: 45496K Feb 9 20:31:35.557993 kernel: Write protecting the kernel read-only data: 28672k Feb 9 20:31:35.557998 kernel: Freeing unused kernel image (text/rodata gap) memory: 2040K Feb 9 20:31:35.558003 kernel: Freeing unused kernel image (rodata/data gap) memory: 636K Feb 9 20:31:35.558008 kernel: Run /init as init process Feb 9 20:31:35.558013 kernel: with arguments: Feb 9 20:31:35.558019 kernel: /init Feb 9 20:31:35.558024 kernel: with environment: Feb 9 20:31:35.558029 kernel: HOME=/ Feb 9 20:31:35.558034 kernel: TERM=linux Feb 9 20:31:35.558040 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Feb 9 20:31:35.558046 systemd[1]: systemd 252 running in system mode (+PAM +AUDIT 
+SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL -ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE -TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified) Feb 9 20:31:35.558053 systemd[1]: Detected architecture x86-64. Feb 9 20:31:35.558058 systemd[1]: Running in initrd. Feb 9 20:31:35.558063 systemd[1]: No hostname configured, using default hostname. Feb 9 20:31:35.558069 systemd[1]: Hostname set to . Feb 9 20:31:35.558074 systemd[1]: Initializing machine ID from random generator. Feb 9 20:31:35.558080 systemd[1]: Queued start job for default target initrd.target. Feb 9 20:31:35.558086 systemd[1]: Started systemd-ask-password-console.path. Feb 9 20:31:35.558091 systemd[1]: Reached target cryptsetup.target. Feb 9 20:31:35.558096 systemd[1]: Reached target paths.target. Feb 9 20:31:35.558101 systemd[1]: Reached target slices.target. Feb 9 20:31:35.558107 systemd[1]: Reached target swap.target. Feb 9 20:31:35.558112 systemd[1]: Reached target timers.target. Feb 9 20:31:35.558117 systemd[1]: Listening on iscsid.socket. Feb 9 20:31:35.558124 systemd[1]: Listening on iscsiuio.socket. Feb 9 20:31:35.558130 systemd[1]: Listening on systemd-journald-audit.socket. Feb 9 20:31:35.558135 systemd[1]: Listening on systemd-journald-dev-log.socket. Feb 9 20:31:35.558140 systemd[1]: Listening on systemd-journald.socket. Feb 9 20:31:35.558146 kernel: tsc: Refined TSC clocksource calibration: 3407.998 MHz Feb 9 20:31:35.558151 systemd[1]: Listening on systemd-networkd.socket. Feb 9 20:31:35.558156 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x311fd208cfc, max_idle_ns: 440795283699 ns Feb 9 20:31:35.558162 kernel: clocksource: Switched to clocksource tsc Feb 9 20:31:35.558168 systemd[1]: Listening on systemd-udevd-control.socket. Feb 9 20:31:35.558173 systemd[1]: Listening on systemd-udevd-kernel.socket. 
Feb 9 20:31:35.558179 systemd[1]: Reached target sockets.target. Feb 9 20:31:35.558184 systemd[1]: Starting kmod-static-nodes.service... Feb 9 20:31:35.558189 systemd[1]: Finished network-cleanup.service. Feb 9 20:31:35.558195 systemd[1]: Starting systemd-fsck-usr.service... Feb 9 20:31:35.558200 systemd[1]: Starting systemd-journald.service... Feb 9 20:31:35.558205 systemd[1]: Starting systemd-modules-load.service... Feb 9 20:31:35.558213 systemd-journald[267]: Journal started Feb 9 20:31:35.558238 systemd-journald[267]: Runtime Journal (/run/log/journal/cf9d32e5fc634c1898386f43dd9038cd) is 8.0M, max 640.1M, 632.1M free. Feb 9 20:31:35.560956 systemd-modules-load[268]: Inserted module 'overlay' Feb 9 20:31:35.566000 audit: BPF prog-id=6 op=LOAD Feb 9 20:31:35.585385 kernel: audit: type=1334 audit(1707510695.566:2): prog-id=6 op=LOAD Feb 9 20:31:35.585415 systemd[1]: Starting systemd-resolved.service... Feb 9 20:31:35.635353 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Feb 9 20:31:35.635383 systemd[1]: Starting systemd-vconsole-setup.service... Feb 9 20:31:35.667350 kernel: Bridge firewalling registered Feb 9 20:31:35.667383 systemd[1]: Started systemd-journald.service. Feb 9 20:31:35.682062 systemd-modules-load[268]: Inserted module 'br_netfilter' Feb 9 20:31:35.732957 kernel: audit: type=1130 audit(1707510695.690:3): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:31:35.690000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 9 20:31:35.687916 systemd-resolved[271]: Positive Trust Anchors: Feb 9 20:31:35.808678 kernel: SCSI subsystem initialized Feb 9 20:31:35.808690 kernel: audit: type=1130 audit(1707510695.745:4): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:31:35.808700 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Feb 9 20:31:35.745000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:31:35.687921 systemd-resolved[271]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Feb 9 20:31:35.910818 kernel: device-mapper: uevent: version 1.0.3 Feb 9 20:31:35.910829 kernel: device-mapper: ioctl: 4.45.0-ioctl (2021-03-22) initialised: dm-devel@redhat.com Feb 9 20:31:35.910864 kernel: audit: type=1130 audit(1707510695.865:5): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:31:35.865000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 9 20:31:35.687939 systemd-resolved[271]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa corp home internal intranet lan local private test Feb 9 20:31:35.984544 kernel: audit: type=1130 audit(1707510695.919:6): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-usr comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:31:35.919000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-usr comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:31:35.689450 systemd-resolved[271]: Defaulting to hostname 'linux'. Feb 9 20:31:35.993000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:31:35.691743 systemd[1]: Started systemd-resolved.service. Feb 9 20:31:36.092427 kernel: audit: type=1130 audit(1707510695.993:7): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:31:36.092439 kernel: audit: type=1130 audit(1707510696.046:8): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 9 20:31:36.046000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:31:35.746511 systemd[1]: Finished kmod-static-nodes.service. Feb 9 20:31:35.866833 systemd[1]: Finished systemd-fsck-usr.service. Feb 9 20:31:35.911267 systemd-modules-load[268]: Inserted module 'dm_multipath' Feb 9 20:31:35.919635 systemd[1]: Finished systemd-modules-load.service. Feb 9 20:31:35.993624 systemd[1]: Finished systemd-vconsole-setup.service. Feb 9 20:31:36.046610 systemd[1]: Reached target nss-lookup.target. Feb 9 20:31:36.100933 systemd[1]: Starting dracut-cmdline-ask.service... Feb 9 20:31:36.121921 systemd[1]: Starting systemd-sysctl.service... Feb 9 20:31:36.122211 systemd[1]: Starting systemd-tmpfiles-setup-dev.service... Feb 9 20:31:36.125058 systemd[1]: Finished systemd-tmpfiles-setup-dev.service. Feb 9 20:31:36.123000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:31:36.125815 systemd[1]: Finished systemd-sysctl.service. Feb 9 20:31:36.174396 kernel: audit: type=1130 audit(1707510696.123:9): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:31:36.186000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:31:36.186676 systemd[1]: Finished dracut-cmdline-ask.service. 
Feb 9 20:31:36.252437 kernel: audit: type=1130 audit(1707510696.186:10): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:31:36.243000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:31:36.243981 systemd[1]: Starting dracut-cmdline.service... Feb 9 20:31:36.266462 dracut-cmdline[295]: dracut-dracut-053 Feb 9 20:31:36.266462 dracut-cmdline[295]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LA Feb 9 20:31:36.266462 dracut-cmdline[295]: BEL=ROOT console=tty0 console=ttyS1,115200n8 flatcar.first_boot=detected flatcar.oem.id=packet flatcar.autologin verity.usrhash=4dbf910aaff679d18007a871aba359cc2cf6cb85992bb7598afad40271debbd6 Feb 9 20:31:36.333416 kernel: Loading iSCSI transport class v2.0-870. Feb 9 20:31:36.333428 kernel: iscsi: registered transport (tcp) Feb 9 20:31:36.381827 kernel: iscsi: registered transport (qla4xxx) Feb 9 20:31:36.381848 kernel: QLogic iSCSI HBA Driver Feb 9 20:31:36.398353 systemd[1]: Finished dracut-cmdline.service. Feb 9 20:31:36.407000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:31:36.408091 systemd[1]: Starting dracut-pre-udev.service... 
Feb 9 20:31:36.465404 kernel: raid6: avx2x4 gen() 38966 MB/s Feb 9 20:31:36.500370 kernel: raid6: avx2x4 xor() 14328 MB/s Feb 9 20:31:36.535370 kernel: raid6: avx2x2 gen() 51892 MB/s Feb 9 20:31:36.570401 kernel: raid6: avx2x2 xor() 32102 MB/s Feb 9 20:31:36.605400 kernel: raid6: avx2x1 gen() 44516 MB/s Feb 9 20:31:36.639398 kernel: raid6: avx2x1 xor() 27409 MB/s Feb 9 20:31:36.673404 kernel: raid6: sse2x4 gen() 20945 MB/s Feb 9 20:31:36.707400 kernel: raid6: sse2x4 xor() 11510 MB/s Feb 9 20:31:36.741404 kernel: raid6: sse2x2 gen() 21194 MB/s Feb 9 20:31:36.775421 kernel: raid6: sse2x2 xor() 13176 MB/s Feb 9 20:31:36.809400 kernel: raid6: sse2x1 gen() 17907 MB/s Feb 9 20:31:36.860995 kernel: raid6: sse2x1 xor() 8762 MB/s Feb 9 20:31:36.861010 kernel: raid6: using algorithm avx2x2 gen() 51892 MB/s Feb 9 20:31:36.861018 kernel: raid6: .... xor() 32102 MB/s, rmw enabled Feb 9 20:31:36.879054 kernel: raid6: using avx2x2 recovery algorithm Feb 9 20:31:36.925399 kernel: xor: automatically using best checksumming function avx Feb 9 20:31:37.003375 kernel: Btrfs loaded, crc32c=crc32c-intel, zoned=no, fsverity=no Feb 9 20:31:37.008433 systemd[1]: Finished dracut-pre-udev.service. Feb 9 20:31:37.017000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:31:37.017000 audit: BPF prog-id=7 op=LOAD Feb 9 20:31:37.017000 audit: BPF prog-id=8 op=LOAD Feb 9 20:31:37.018333 systemd[1]: Starting systemd-udevd.service... Feb 9 20:31:37.026528 systemd-udevd[478]: Using default interface naming scheme 'v252'. Feb 9 20:31:37.048000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:31:37.032638 systemd[1]: Started systemd-udevd.service. 
Feb 9 20:31:37.075471 dracut-pre-trigger[490]: rd.md=0: removing MD RAID activation Feb 9 20:31:37.048964 systemd[1]: Starting dracut-pre-trigger.service... Feb 9 20:31:37.090000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:31:37.078353 systemd[1]: Finished dracut-pre-trigger.service. Feb 9 20:31:37.092723 systemd[1]: Starting systemd-udev-trigger.service... Feb 9 20:31:37.143472 systemd[1]: Finished systemd-udev-trigger.service. Feb 9 20:31:37.177864 kernel: cryptd: max_cpu_qlen set to 1000 Feb 9 20:31:37.177878 kernel: ACPI: bus type USB registered Feb 9 20:31:37.142000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:31:37.208363 kernel: usbcore: registered new interface driver usbfs Feb 9 20:31:37.208386 kernel: usbcore: registered new interface driver hub Feb 9 20:31:37.225970 kernel: usbcore: registered new device driver usb Feb 9 20:31:37.284301 kernel: AVX2 version of gcm_enc/dec engaged. Feb 9 20:31:37.284348 kernel: AES CTR mode by8 optimization enabled Feb 9 20:31:37.284358 kernel: mlx5_core 0000:01:00.0: firmware version: 14.31.1014 Feb 9 20:31:37.322117 kernel: mlx5_core 0000:01:00.0: 63.008 Gb/s available PCIe bandwidth (8.0 GT/s PCIe x8 link) Feb 9 20:31:37.324348 kernel: libata version 3.00 loaded. Feb 9 20:31:37.362244 kernel: xhci_hcd 0000:00:14.0: xHCI Host Controller Feb 9 20:31:37.362390 kernel: xhci_hcd 0000:00:14.0: new USB bus registered, assigned bus number 1 Feb 9 20:31:37.399273 kernel: igb: Intel(R) Gigabit Ethernet Network Driver Feb 9 20:31:37.399296 kernel: xhci_hcd 0000:00:14.0: hcc params 0x200077c1 hci version 0x110 quirks 0x0000000000009810 Feb 9 20:31:37.399407 kernel: igb: Copyright (c) 2007-2014 Intel Corporation. 
Feb 9 20:31:37.416346 kernel: xhci_hcd 0000:00:14.0: xHCI Host Controller Feb 9 20:31:37.449784 kernel: xhci_hcd 0000:00:14.0: new USB bus registered, assigned bus number 2 Feb 9 20:31:37.449867 kernel: ahci 0000:00:17.0: version 3.0 Feb 9 20:31:37.449942 kernel: xhci_hcd 0000:00:14.0: Host supports USB 3.1 Enhanced SuperSpeed Feb 9 20:31:37.455344 kernel: ahci 0000:00:17.0: AHCI 0001.0301 32 slots 7 ports 6 Gbps 0x7f impl SATA mode Feb 9 20:31:37.455428 kernel: pps pps0: new PPS source ptp0 Feb 9 20:31:37.455512 kernel: igb 0000:03:00.0: added PHC on eth0 Feb 9 20:31:37.455593 kernel: igb 0000:03:00.0: Intel(R) Gigabit Ethernet Network Connection Feb 9 20:31:37.455669 kernel: igb 0000:03:00.0: eth0: (PCIe:2.5Gb/s:Width x1) 00:25:90:bd:75:7c Feb 9 20:31:37.455743 kernel: igb 0000:03:00.0: eth0: PBA No: 010000-000 Feb 9 20:31:37.455818 kernel: igb 0000:03:00.0: Using MSI-X interrupts. 4 rx queue(s), 4 tx queue(s) Feb 9 20:31:37.485483 kernel: hub 1-0:1.0: USB hub found Feb 9 20:31:37.485581 kernel: ahci 0000:00:17.0: flags: 64bit ncq sntf clo only pio slum part ems deso sadm sds apst Feb 9 20:31:37.492342 kernel: pps pps1: new PPS source ptp1 Feb 9 20:31:37.492431 kernel: igb 0000:04:00.0: added PHC on eth1 Feb 9 20:31:37.492512 kernel: igb 0000:04:00.0: Intel(R) Gigabit Ethernet Network Connection Feb 9 20:31:37.492586 kernel: igb 0000:04:00.0: eth1: (PCIe:2.5Gb/s:Width x1) 00:25:90:bd:75:7d Feb 9 20:31:37.492659 kernel: igb 0000:04:00.0: eth1: PBA No: 010000-000 Feb 9 20:31:37.492732 kernel: igb 0000:04:00.0: Using MSI-X interrupts. 
4 rx queue(s), 4 tx queue(s) Feb 9 20:31:37.513087 kernel: hub 1-0:1.0: 16 ports detected Feb 9 20:31:37.529343 kernel: igb 0000:03:00.0 eno1: renamed from eth0 Feb 9 20:31:37.531343 kernel: hub 2-0:1.0: USB hub found Feb 9 20:31:37.531445 kernel: scsi host0: ahci Feb 9 20:31:37.531526 kernel: scsi host1: ahci Feb 9 20:31:37.531601 kernel: scsi host2: ahci Feb 9 20:31:37.531674 kernel: scsi host3: ahci Feb 9 20:31:37.531747 kernel: scsi host4: ahci Feb 9 20:31:37.531825 kernel: scsi host5: ahci Feb 9 20:31:37.531899 kernel: scsi host6: ahci Feb 9 20:31:37.531973 kernel: ata1: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516100 irq 133 Feb 9 20:31:37.531985 kernel: ata2: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516180 irq 133 Feb 9 20:31:37.531996 kernel: ata3: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516200 irq 133 Feb 9 20:31:37.532007 kernel: ata4: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516280 irq 133 Feb 9 20:31:37.532018 kernel: ata5: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516300 irq 133 Feb 9 20:31:37.532029 kernel: ata6: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516380 irq 133 Feb 9 20:31:37.532041 kernel: ata7: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516400 irq 133 Feb 9 20:31:37.603395 kernel: mlx5_core 0000:01:00.0: E-Switch: Total vports 10, per vport: max uc(128) max mc(2048) Feb 9 20:31:37.603471 kernel: hub 2-0:1.0: 10 ports detected Feb 9 20:31:37.711367 kernel: igb 0000:04:00.0 eno2: renamed from eth1 Feb 9 20:31:37.711452 kernel: usb: port power management may be unreliable Feb 9 20:31:37.833393 kernel: usb 1-14: new high-speed USB device number 2 using xhci_hcd Feb 9 20:31:37.849380 kernel: ata2: SATA link up 6.0 Gbps (SStatus 133 SControl 300) Feb 9 20:31:37.864343 kernel: mlx5_core 0000:01:00.0: MLX5E: StrdRq(0) RqSz(1024) StrdSz(256) RxCqeCmprss(0) Feb 9 20:31:37.864421 kernel: ata4: SATA link down (SStatus 0 SControl 300) Feb 9 20:31:37.978395 kernel: hub 1-14:1.0: USB hub found Feb 9 
20:31:37.978480 kernel: ata1: SATA link up 6.0 Gbps (SStatus 133 SControl 300) Feb 9 20:31:38.008610 kernel: hub 1-14:1.0: 4 ports detected Feb 9 20:31:38.008686 kernel: ata2.00: ATA-10: Micron_5200_MTFDDAK480TDN, D1MU020, max UDMA/133 Feb 9 20:31:38.075407 kernel: ata7: SATA link down (SStatus 0 SControl 300) Feb 9 20:31:38.088397 kernel: ata1.00: ATA-10: Micron_5200_MTFDDAK480TDN, D1MU020, max UDMA/133 Feb 9 20:31:38.103392 kernel: ata6: SATA link down (SStatus 0 SControl 300) Feb 9 20:31:38.117406 kernel: mlx5_core 0000:01:00.0: Supported tc offload range - chains: 4294967294, prios: 4294967295 Feb 9 20:31:38.117595 kernel: ata3: SATA link down (SStatus 0 SControl 300) Feb 9 20:31:38.148350 kernel: mlx5_core 0000:01:00.1: firmware version: 14.31.1014 Feb 9 20:31:38.148537 kernel: ata5: SATA link down (SStatus 0 SControl 300) Feb 9 20:31:38.175689 kernel: mlx5_core 0000:01:00.1: 63.008 Gb/s available PCIe bandwidth (8.0 GT/s PCIe x8 link) Feb 9 20:31:38.193342 kernel: ata2.00: 937703088 sectors, multi 16: LBA48 NCQ (depth 32), AA Feb 9 20:31:38.220835 kernel: ata2.00: Features: NCQ-prio Feb 9 20:31:38.220857 kernel: ata1.00: 937703088 sectors, multi 16: LBA48 NCQ (depth 32), AA Feb 9 20:31:38.248422 kernel: ata1.00: Features: NCQ-prio Feb 9 20:31:38.265389 kernel: ata2.00: configured for UDMA/133 Feb 9 20:31:38.265405 kernel: ata1.00: configured for UDMA/133 Feb 9 20:31:38.294393 kernel: scsi 0:0:0:0: Direct-Access ATA Micron_5200_MTFD U020 PQ: 0 ANSI: 5 Feb 9 20:31:38.294482 kernel: scsi 1:0:0:0: Direct-Access ATA Micron_5200_MTFD U020 PQ: 0 ANSI: 5 Feb 9 20:31:38.310363 kernel: usb 1-14.1: new low-speed USB device number 3 using xhci_hcd Feb 9 20:31:38.345451 kernel: ata1.00: Enabling discard_zeroes_data Feb 9 20:31:38.345469 kernel: ata2.00: Enabling discard_zeroes_data Feb 9 20:31:38.358281 kernel: sd 0:0:0:0: [sda] 937703088 512-byte logical blocks: (480 GB/447 GiB) Feb 9 20:31:38.358385 kernel: sd 1:0:0:0: [sdb] 937703088 512-byte logical blocks: (480 
GB/447 GiB) Feb 9 20:31:38.389873 kernel: sd 0:0:0:0: [sda] 4096-byte physical blocks Feb 9 20:31:38.389944 kernel: sd 1:0:0:0: [sdb] 4096-byte physical blocks Feb 9 20:31:38.403140 kernel: sd 0:0:0:0: [sda] Write Protect is off Feb 9 20:31:38.416400 kernel: sd 1:0:0:0: [sdb] Write Protect is off Feb 9 20:31:38.429298 kernel: sd 0:0:0:0: [sda] Mode Sense: 00 3a 00 00 Feb 9 20:31:38.442241 kernel: sd 1:0:0:0: [sdb] Mode Sense: 00 3a 00 00 Feb 9 20:31:38.442315 kernel: sd 1:0:0:0: [sdb] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA Feb 9 20:31:38.442379 kernel: sd 0:0:0:0: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA Feb 9 20:31:38.477350 kernel: ata2.00: Enabling discard_zeroes_data Feb 9 20:31:38.494389 kernel: hid: raw HID events driver (C) Jiri Kosina Feb 9 20:31:38.494405 kernel: ata2.00: Enabling discard_zeroes_data Feb 9 20:31:38.494412 kernel: mlx5_core 0000:01:00.1: E-Switch: Total vports 10, per vport: max uc(128) max mc(2048) Feb 9 20:31:38.498394 kernel: port_module: 9 callbacks suppressed Feb 9 20:31:38.498411 kernel: mlx5_core 0000:01:00.1: Port module event: module 1, Cable plugged Feb 9 20:31:38.517896 kernel: mlx5_core 0000:01:00.1: MLX5E: StrdRq(0) RqSz(1024) StrdSz(256) RxCqeCmprss(0) Feb 9 20:31:38.517966 kernel: ata1.00: Enabling discard_zeroes_data Feb 9 20:31:38.599617 kernel: sd 1:0:0:0: [sdb] Attached SCSI disk Feb 9 20:31:38.614396 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Feb 9 20:31:38.644701 kernel: GPT:9289727 != 937703087 Feb 9 20:31:38.644716 kernel: GPT:Alternate GPT header not at the end of the disk. Feb 9 20:31:38.660588 kernel: GPT:9289727 != 937703087 Feb 9 20:31:38.673925 kernel: GPT: Use GNU Parted to correct GPT errors. 
Feb 9 20:31:38.688802 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Feb 9 20:31:38.718103 kernel: ata1.00: Enabling discard_zeroes_data Feb 9 20:31:38.718148 kernel: sd 0:0:0:0: [sda] Attached SCSI disk Feb 9 20:31:38.760782 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device. Feb 9 20:31:38.869703 kernel: usbcore: registered new interface driver usbhid Feb 9 20:31:38.869719 kernel: usbhid: USB HID core driver Feb 9 20:31:38.869730 kernel: mlx5_core 0000:01:00.1: Supported tc offload range - chains: 4294967294, prios: 4294967295 Feb 9 20:31:38.869802 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/sda6 scanned by (udev-worker) (545) Feb 9 20:31:38.869810 kernel: mlx5_core 0000:01:00.0 enp1s0f0np0: renamed from eth0 Feb 9 20:31:38.869863 kernel: input: HID 0557:2419 as /devices/pci0000:00/0000:00:14.0/usb1/1-14/1-14.1/1-14.1:1.0/0003:0557:2419.0001/input/input0 Feb 9 20:31:38.869871 kernel: mlx5_core 0000:01:00.1 enp1s0f1np1: renamed from eth1 Feb 9 20:31:38.818442 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device. Feb 9 20:31:38.901471 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device. Feb 9 20:31:39.004196 kernel: hid-generic 0003:0557:2419.0001: input,hidraw0: USB HID v1.00 Keyboard [HID 0557:2419] on usb-0000:00:14.0-14.1/input0 Feb 9 20:31:39.004326 kernel: input: HID 0557:2419 as /devices/pci0000:00/0000:00:14.0/usb1/1-14/1-14.1/1-14.1:1.1/0003:0557:2419.0002/input/input1 Feb 9 20:31:39.004335 kernel: hid-generic 0003:0557:2419.0002: input,hidraw1: USB HID v1.00 Mouse [HID 0557:2419] on usb-0000:00:14.0-14.1/input1 Feb 9 20:31:38.926099 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device. Feb 9 20:31:39.021760 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device. Feb 9 20:31:39.035931 systemd[1]: Starting disk-uuid.service... 
Feb 9 20:31:39.071472 kernel: ata1.00: Enabling discard_zeroes_data Feb 9 20:31:39.071483 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Feb 9 20:31:39.071532 disk-uuid[692]: Primary Header is updated. Feb 9 20:31:39.071532 disk-uuid[692]: Secondary Entries is updated. Feb 9 20:31:39.071532 disk-uuid[692]: Secondary Header is updated. Feb 9 20:31:39.143468 kernel: ata1.00: Enabling discard_zeroes_data Feb 9 20:31:39.143481 kernel: GPT:disk_guids don't match. Feb 9 20:31:39.143488 kernel: GPT: Use GNU Parted to correct GPT errors. Feb 9 20:31:39.143495 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Feb 9 20:31:39.143501 kernel: ata1.00: Enabling discard_zeroes_data Feb 9 20:31:39.183343 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Feb 9 20:31:40.131251 kernel: ata1.00: Enabling discard_zeroes_data Feb 9 20:31:40.150315 disk-uuid[693]: The operation has completed successfully. Feb 9 20:31:40.159454 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Feb 9 20:31:40.189036 systemd[1]: disk-uuid.service: Deactivated successfully. Feb 9 20:31:40.303463 kernel: audit: type=1130 audit(1707510700.197:19): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:31:40.303478 kernel: audit: type=1131 audit(1707510700.197:20): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:31:40.303488 kernel: device-mapper: verity: sha256 using implementation "sha256-avx2" Feb 9 20:31:40.197000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:31:40.197000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? 
addr=? terminal=? res=success' Feb 9 20:31:40.189095 systemd[1]: Finished disk-uuid.service. Feb 9 20:31:40.198061 systemd[1]: Starting verity-setup.service... Feb 9 20:31:40.347399 systemd[1]: Found device dev-mapper-usr.device. Feb 9 20:31:40.348224 systemd[1]: Mounting sysusr-usr.mount... Feb 9 20:31:40.362611 systemd[1]: Finished verity-setup.service. Feb 9 20:31:40.381000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=verity-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:31:40.429346 kernel: audit: type=1130 audit(1707510700.381:21): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=verity-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:31:40.458291 systemd[1]: Mounted sysusr-usr.mount. Feb 9 20:31:40.474437 kernel: EXT4-fs (dm-0): mounted filesystem without journal. Opts: norecovery. Quota mode: none. Feb 9 20:31:40.466643 systemd[1]: afterburn-network-kargs.service was skipped because no trigger condition checks were met. Feb 9 20:31:40.467037 systemd[1]: Starting ignition-setup.service... Feb 9 20:31:40.565461 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Feb 9 20:31:40.565474 kernel: BTRFS info (device sda6): using free space tree Feb 9 20:31:40.565481 kernel: BTRFS info (device sda6): has skinny extents Feb 9 20:31:40.565488 kernel: BTRFS info (device sda6): enabling ssd optimizations Feb 9 20:31:40.474851 systemd[1]: Starting parse-ip-for-networkd.service... Feb 9 20:31:40.574000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:31:40.559289 systemd[1]: Finished parse-ip-for-networkd.service. 
Feb 9 20:31:40.679955 kernel: audit: type=1130 audit(1707510700.574:22): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:31:40.679971 kernel: audit: type=1130 audit(1707510700.631:23): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:31:40.631000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:31:40.574893 systemd[1]: Finished ignition-setup.service. Feb 9 20:31:40.711011 kernel: audit: type=1334 audit(1707510700.687:24): prog-id=9 op=LOAD Feb 9 20:31:40.687000 audit: BPF prog-id=9 op=LOAD Feb 9 20:31:40.632026 systemd[1]: Starting ignition-fetch-offline.service... Feb 9 20:31:40.689375 systemd[1]: Starting systemd-networkd.service... Feb 9 20:31:40.725300 systemd-networkd[880]: lo: Link UP Feb 9 20:31:40.741000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:31:40.749144 ignition[868]: Ignition 2.14.0 Feb 9 20:31:40.806463 kernel: audit: type=1130 audit(1707510700.741:25): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:31:40.725303 systemd-networkd[880]: lo: Gained carrier Feb 9 20:31:40.749148 ignition[868]: Stage: fetch-offline Feb 9 20:31:40.817000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=iscsiuio comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 9 20:31:40.725687 systemd-networkd[880]: Enumeration completed Feb 9 20:31:40.931067 kernel: audit: type=1130 audit(1707510700.817:26): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=iscsiuio comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:31:40.931081 kernel: audit: type=1130 audit(1707510700.878:27): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:31:40.878000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:31:40.749172 ignition[868]: reading system config file "/usr/lib/ignition/base.d/base.ign" Feb 9 20:31:40.725731 systemd[1]: Started systemd-networkd.service. Feb 9 20:31:40.749186 ignition[868]: parsing config with SHA512: 0131bd505bfe1b1215ca4ec9809701a3323bf448114294874f7249d8d300440bd742a7532f60673bfa0746c04de0bd5ca68d0fe9a8ecd59464b13a6401323cb4 Feb 9 20:31:40.726334 systemd-networkd[880]: enp1s0f1np1: Configuring with /usr/lib/systemd/network/zz-default.network. Feb 9 20:31:40.968000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=iscsid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 9 20:31:40.755880 ignition[868]: no config dir at "/usr/lib/ignition/base.platform.d/packet" Feb 9 20:31:41.035396 kernel: mlx5_core 0000:01:00.1 enp1s0f1np1: Link up Feb 9 20:31:41.035488 kernel: IPv6: ADDRCONF(NETDEV_CHANGE): enp1s0f1np1: link becomes ready Feb 9 20:31:41.035497 iscsid[909]: iscsid: can't open InitiatorName configuration file /etc/iscsi/initiatorname.iscsi Feb 9 20:31:41.035497 iscsid[909]: iscsid: Warning: InitiatorName file /etc/iscsi/initiatorname.iscsi does not exist or does not contain a properly formatted InitiatorName. If using software iscsi (iscsi_tcp or ib_iser) or partial offload (bnx2i or cxgbi iscsi), you may not be able to log Feb 9 20:31:41.035497 iscsid[909]: into or discover targets. Please create a file /etc/iscsi/initiatorname.iscsi that contains a sting with the format: InitiatorName=iqn.yyyy-mm.<reversed domain name>[:identifier]. Feb 9 20:31:41.035497 iscsid[909]: Example: InitiatorName=iqn.2001-04.com.redhat:fc6. Feb 9 20:31:41.035497 iscsid[909]: If using hardware iscsi like qla4xxx this message can be ignored. Feb 9 20:31:41.035497 iscsid[909]: iscsid: can't open InitiatorAlias configuration file /etc/iscsi/initiatorname.iscsi Feb 9 20:31:41.035497 iscsid[909]: iscsid: can't open iscsid.safe_logout configuration file /etc/iscsi/iscsid.conf Feb 9 20:31:41.043000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:31:40.742506 systemd[1]: Reached target network.target. Feb 9 20:31:41.181000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=?
res=success' Feb 9 20:31:40.755942 ignition[868]: parsed url from cmdline: "" Feb 9 20:31:40.777368 unknown[868]: fetched base config from "system" Feb 9 20:31:40.755945 ignition[868]: no config URL provided Feb 9 20:31:40.777373 unknown[868]: fetched user config from "system" Feb 9 20:31:40.755948 ignition[868]: reading system config file "/usr/lib/ignition/user.ign" Feb 9 20:31:40.800909 systemd[1]: Starting iscsiuio.service... Feb 9 20:31:40.755977 ignition[868]: parsing config with SHA512: 3b57f86ae4b840f25d2371565658eee9f5491034ea61da11cfa5cba450a420e4bd3222a3d3cc0c9561742f9391e6cd6a8f5390599942f279c2aad5d9f9348b20 Feb 9 20:31:40.806607 systemd[1]: Started iscsiuio.service. Feb 9 20:31:40.777757 ignition[868]: fetch-offline: fetch-offline passed Feb 9 20:31:40.818535 systemd[1]: Finished ignition-fetch-offline.service. Feb 9 20:31:41.279460 kernel: mlx5_core 0000:01:00.0 enp1s0f0np0: Link up Feb 9 20:31:40.777760 ignition[868]: POST message to Packet Timeline Feb 9 20:31:40.878577 systemd[1]: ignition-fetch.service was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). Feb 9 20:31:40.777764 ignition[868]: POST Status error: resource requires networking Feb 9 20:31:40.879029 systemd[1]: Starting ignition-kargs.service... Feb 9 20:31:40.777794 ignition[868]: Ignition finished successfully Feb 9 20:31:40.938029 systemd[1]: Starting iscsid.service... Feb 9 20:31:40.935273 ignition[898]: Ignition 2.14.0 Feb 9 20:31:40.955635 systemd[1]: Started iscsid.service. Feb 9 20:31:40.935276 ignition[898]: Stage: kargs Feb 9 20:31:40.969878 systemd[1]: Starting dracut-initqueue.service... Feb 9 20:31:40.935333 ignition[898]: reading system config file "/usr/lib/ignition/base.d/base.ign" Feb 9 20:31:41.000986 systemd-networkd[880]: enp1s0f0np0: Configuring with /usr/lib/systemd/network/zz-default.network. 
Feb 9 20:31:40.935344 ignition[898]: parsing config with SHA512: 0131bd505bfe1b1215ca4ec9809701a3323bf448114294874f7249d8d300440bd742a7532f60673bfa0746c04de0bd5ca68d0fe9a8ecd59464b13a6401323cb4 Feb 9 20:31:41.013535 systemd[1]: Finished dracut-initqueue.service. Feb 9 20:31:40.936602 ignition[898]: no config dir at "/usr/lib/ignition/base.platform.d/packet" Feb 9 20:31:41.043585 systemd[1]: Reached target remote-fs-pre.target. Feb 9 20:31:40.938568 ignition[898]: kargs: kargs passed Feb 9 20:31:41.088404 systemd[1]: Reached target remote-cryptsetup.target. Feb 9 20:31:40.938571 ignition[898]: POST message to Packet Timeline Feb 9 20:31:41.126531 systemd[1]: Reached target remote-fs.target. Feb 9 20:31:40.938582 ignition[898]: GET https://metadata.packet.net/metadata: attempt #1 Feb 9 20:31:41.144323 systemd[1]: Starting dracut-pre-mount.service... Feb 9 20:31:40.939994 ignition[898]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:45055->[::1]:53: read: connection refused Feb 9 20:31:41.162837 systemd[1]: Finished dracut-pre-mount.service. Feb 9 20:31:41.140470 ignition[898]: GET https://metadata.packet.net/metadata: attempt #2 Feb 9 20:31:41.272945 systemd-networkd[880]: eno2: Configuring with /usr/lib/systemd/network/zz-default.network. Feb 9 20:31:41.140871 ignition[898]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:47029->[::1]:53: read: connection refused Feb 9 20:31:41.301642 systemd-networkd[880]: eno1: Configuring with /usr/lib/systemd/network/zz-default.network. 
Feb 9 20:31:41.330443 systemd-networkd[880]: enp1s0f1np1: Link UP Feb 9 20:31:41.330716 systemd-networkd[880]: enp1s0f1np1: Gained carrier Feb 9 20:31:41.540977 ignition[898]: GET https://metadata.packet.net/metadata: attempt #3 Feb 9 20:31:41.339814 systemd-networkd[880]: enp1s0f0np0: Link UP Feb 9 20:31:41.542112 ignition[898]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:50200->[::1]:53: read: connection refused Feb 9 20:31:41.340163 systemd-networkd[880]: eno2: Link UP Feb 9 20:31:41.340507 systemd-networkd[880]: eno1: Link UP Feb 9 20:31:42.055895 systemd-networkd[880]: enp1s0f0np0: Gained carrier Feb 9 20:31:42.064586 kernel: IPv6: ADDRCONF(NETDEV_CHANGE): enp1s0f0np0: link becomes ready Feb 9 20:31:42.091562 systemd-networkd[880]: enp1s0f0np0: DHCPv4 address 86.109.11.101/31, gateway 86.109.11.100 acquired from 145.40.83.140 Feb 9 20:31:42.342651 ignition[898]: GET https://metadata.packet.net/metadata: attempt #4 Feb 9 20:31:42.343501 ignition[898]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:47864->[::1]:53: read: connection refused Feb 9 20:31:42.800847 systemd-networkd[880]: enp1s0f1np1: Gained IPv6LL Feb 9 20:31:43.696928 systemd-networkd[880]: enp1s0f0np0: Gained IPv6LL Feb 9 20:31:43.944493 ignition[898]: GET https://metadata.packet.net/metadata: attempt #5 Feb 9 20:31:43.945606 ignition[898]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:34073->[::1]:53: read: connection refused Feb 9 20:31:47.148084 ignition[898]: GET https://metadata.packet.net/metadata: attempt #6 Feb 9 20:31:47.182509 ignition[898]: GET result: OK Feb 9 20:31:47.392700 ignition[898]: Ignition finished successfully Feb 9 20:31:47.397157 systemd[1]: Finished ignition-kargs.service. 
Feb 9 20:31:47.484531 kernel: kauditd_printk_skb: 3 callbacks suppressed Feb 9 20:31:47.484547 kernel: audit: type=1130 audit(1707510707.407:31): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:31:47.407000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:31:47.416927 ignition[927]: Ignition 2.14.0 Feb 9 20:31:47.410735 systemd[1]: Starting ignition-disks.service... Feb 9 20:31:47.416931 ignition[927]: Stage: disks Feb 9 20:31:47.416987 ignition[927]: reading system config file "/usr/lib/ignition/base.d/base.ign" Feb 9 20:31:47.416996 ignition[927]: parsing config with SHA512: 0131bd505bfe1b1215ca4ec9809701a3323bf448114294874f7249d8d300440bd742a7532f60673bfa0746c04de0bd5ca68d0fe9a8ecd59464b13a6401323cb4 Feb 9 20:31:47.418298 ignition[927]: no config dir at "/usr/lib/ignition/base.platform.d/packet" Feb 9 20:31:47.420075 ignition[927]: disks: disks passed Feb 9 20:31:47.420078 ignition[927]: POST message to Packet Timeline Feb 9 20:31:47.420089 ignition[927]: GET https://metadata.packet.net/metadata: attempt #1 Feb 9 20:31:47.455410 ignition[927]: GET result: OK Feb 9 20:31:47.653819 ignition[927]: Ignition finished successfully Feb 9 20:31:47.656937 systemd[1]: Finished ignition-disks.service. Feb 9 20:31:47.669000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:31:47.670973 systemd[1]: Reached target initrd-root-device.target. Feb 9 20:31:47.755518 kernel: audit: type=1130 audit(1707510707.669:32): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? 
addr=? terminal=? res=success' Feb 9 20:31:47.741520 systemd[1]: Reached target local-fs-pre.target. Feb 9 20:31:47.741635 systemd[1]: Reached target local-fs.target. Feb 9 20:31:47.755648 systemd[1]: Reached target sysinit.target. Feb 9 20:31:47.770634 systemd[1]: Reached target basic.target. Feb 9 20:31:47.791435 systemd[1]: Starting systemd-fsck-root.service... Feb 9 20:31:47.813123 systemd-fsck[943]: ROOT: clean, 602/553520 files, 56014/553472 blocks Feb 9 20:31:47.824882 systemd[1]: Finished systemd-fsck-root.service. Feb 9 20:31:47.914763 kernel: audit: type=1130 audit(1707510707.832:33): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:31:47.914781 kernel: EXT4-fs (sda9): mounted filesystem with ordered data mode. Opts: (null). Quota mode: none. Feb 9 20:31:47.832000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:31:47.839164 systemd[1]: Mounting sysroot.mount... Feb 9 20:31:47.923042 systemd[1]: Mounted sysroot.mount. Feb 9 20:31:47.936675 systemd[1]: Reached target initrd-root-fs.target. Feb 9 20:31:47.944275 systemd[1]: Mounting sysroot-usr.mount... Feb 9 20:31:47.969207 systemd[1]: Starting flatcar-metadata-hostname.service... Feb 9 20:31:47.979004 systemd[1]: Starting flatcar-static-network.service... Feb 9 20:31:47.986601 systemd[1]: ignition-remount-sysroot.service was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Feb 9 20:31:47.986637 systemd[1]: Reached target ignition-diskful.target. Feb 9 20:31:48.010990 systemd[1]: Mounted sysroot-usr.mount. Feb 9 20:31:48.034979 systemd[1]: Mounting sysroot-usr-share-oem.mount... 
Feb 9 20:31:48.177609 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/sda6 scanned by mount (954)
Feb 9 20:31:48.177707 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Feb 9 20:31:48.177717 kernel: BTRFS info (device sda6): using free space tree
Feb 9 20:31:48.177724 kernel: BTRFS info (device sda6): has skinny extents
Feb 9 20:31:48.177732 kernel: BTRFS info (device sda6): enabling ssd optimizations
Feb 9 20:31:48.045798 systemd[1]: Starting initrd-setup-root.service...
Feb 9 20:31:48.186000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-metadata-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:31:48.240269 coreos-metadata[950]: Feb 09 20:31:48.112 INFO Fetching https://metadata.packet.net/metadata: Attempt #1
Feb 9 20:31:48.240269 coreos-metadata[950]: Feb 09 20:31:48.134 INFO Fetch successful
Feb 9 20:31:48.240269 coreos-metadata[950]: Feb 09 20:31:48.152 INFO wrote hostname ci-3510.3.2-a-45f40c263c to /sysroot/etc/hostname
Feb 9 20:31:48.375651 kernel: audit: type=1130 audit(1707510708.186:34): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-metadata-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:31:48.375663 kernel: audit: type=1130 audit(1707510708.248:35): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-static-network comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:31:48.375670 kernel: audit: type=1131 audit(1707510708.248:36): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-static-network comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:31:48.248000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-static-network comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:31:48.248000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-static-network comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:31:48.375711 coreos-metadata[951]: Feb 09 20:31:48.112 INFO Fetching https://metadata.packet.net/metadata: Attempt #1
Feb 9 20:31:48.375711 coreos-metadata[951]: Feb 09 20:31:48.134 INFO Fetch successful
Feb 9 20:31:48.468456 kernel: audit: type=1130 audit(1707510708.395:37): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:31:48.395000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:31:48.468494 initrd-setup-root[961]: cut: /sysroot/etc/passwd: No such file or directory
Feb 9 20:31:48.153577 systemd[1]: Finished flatcar-metadata-hostname.service.
Feb 9 20:31:48.491000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:31:48.521562 initrd-setup-root[969]: cut: /sysroot/etc/group: No such file or directory
Feb 9 20:31:48.559542 kernel: audit: type=1130 audit(1707510708.491:38): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:31:48.187665 systemd[1]: flatcar-static-network.service: Deactivated successfully.
Feb 9 20:31:48.569594 initrd-setup-root[977]: cut: /sysroot/etc/shadow: No such file or directory
Feb 9 20:31:48.187703 systemd[1]: Finished flatcar-static-network.service.
Feb 9 20:31:48.587678 initrd-setup-root[985]: cut: /sysroot/etc/gshadow: No such file or directory
Feb 9 20:31:48.248595 systemd[1]: Mounted sysroot-usr-share-oem.mount.
Feb 9 20:31:48.605554 ignition[1027]: INFO : Ignition 2.14.0
Feb 9 20:31:48.605554 ignition[1027]: INFO : Stage: mount
Feb 9 20:31:48.605554 ignition[1027]: INFO : reading system config file "/usr/lib/ignition/base.d/base.ign"
Feb 9 20:31:48.605554 ignition[1027]: DEBUG : parsing config with SHA512: 0131bd505bfe1b1215ca4ec9809701a3323bf448114294874f7249d8d300440bd742a7532f60673bfa0746c04de0bd5ca68d0fe9a8ecd59464b13a6401323cb4
Feb 9 20:31:48.605554 ignition[1027]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/packet"
Feb 9 20:31:48.605554 ignition[1027]: INFO : mount: mount passed
Feb 9 20:31:48.605554 ignition[1027]: INFO : POST message to Packet Timeline
Feb 9 20:31:48.605554 ignition[1027]: INFO : GET https://metadata.packet.net/metadata: attempt #1
Feb 9 20:31:48.605554 ignition[1027]: INFO : GET result: OK
Feb 9 20:31:48.359084 systemd[1]: Finished initrd-setup-root.service.
Feb 9 20:31:48.701000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:31:48.761497 kernel: audit: type=1130 audit(1707510708.701:39): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:31:48.761516 ignition[1027]: INFO : Ignition finished successfully
Feb 9 20:31:48.395938 systemd[1]: Starting ignition-mount.service...
Feb 9 20:31:48.460966 systemd[1]: Starting sysroot-boot.service...
Feb 9 20:31:48.840444 kernel: BTRFS: device label OEM devid 1 transid 17 /dev/sda6 scanned by mount (1041)
Feb 9 20:31:48.840460 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Feb 9 20:31:48.475857 systemd[1]: sysusr-usr-share-oem.mount: Deactivated successfully.
Feb 9 20:31:48.887417 kernel: BTRFS info (device sda6): using free space tree
Feb 9 20:31:48.887428 kernel: BTRFS info (device sda6): has skinny extents
Feb 9 20:31:48.887435 kernel: BTRFS info (device sda6): enabling ssd optimizations
Feb 9 20:31:48.475900 systemd[1]: sysroot-usr-share-oem.mount: Deactivated successfully.
Feb 9 20:31:48.477755 systemd[1]: Finished sysroot-boot.service.
Feb 9 20:31:48.691611 systemd[1]: Finished ignition-mount.service.
Feb 9 20:31:48.704576 systemd[1]: Starting ignition-files.service...
Feb 9 20:31:48.771162 systemd[1]: Mounting sysroot-usr-share-oem.mount...
Feb 9 20:31:48.967612 ignition[1060]: INFO : Ignition 2.14.0
Feb 9 20:31:48.967612 ignition[1060]: INFO : Stage: files
Feb 9 20:31:48.967612 ignition[1060]: INFO : reading system config file "/usr/lib/ignition/base.d/base.ign"
Feb 9 20:31:48.967612 ignition[1060]: DEBUG : parsing config with SHA512: 0131bd505bfe1b1215ca4ec9809701a3323bf448114294874f7249d8d300440bd742a7532f60673bfa0746c04de0bd5ca68d0fe9a8ecd59464b13a6401323cb4
Feb 9 20:31:48.967612 ignition[1060]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/packet"
Feb 9 20:31:48.967612 ignition[1060]: DEBUG : files: compiled without relabeling support, skipping
Feb 9 20:31:48.967612 ignition[1060]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Feb 9 20:31:48.967612 ignition[1060]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Feb 9 20:31:48.967612 ignition[1060]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Feb 9 20:31:48.967612 ignition[1060]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Feb 9 20:31:48.967612 ignition[1060]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Feb 9 20:31:48.967612 ignition[1060]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
Feb 9 20:31:48.967612 ignition[1060]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1
Feb 9 20:31:48.920171 systemd[1]: Mounted sysroot-usr-share-oem.mount.
Feb 9 20:31:48.956226 unknown[1060]: wrote ssh authorized keys file for user: core
Feb 9 20:31:49.289237 ignition[1060]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Feb 9 20:31:49.356120 ignition[1060]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
Feb 9 20:31:49.356120 ignition[1060]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/etc/flatcar-cgroupv1"
Feb 9 20:31:49.389604 ignition[1060]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/etc/flatcar-cgroupv1"
Feb 9 20:31:49.389604 ignition[1060]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/opt/cni-plugins-linux-amd64-v1.1.1.tgz"
Feb 9 20:31:49.389604 ignition[1060]: INFO : files: createFilesystemsFiles: createFiles: op(5): GET https://github.com/containernetworking/plugins/releases/download/v1.1.1/cni-plugins-linux-amd64-v1.1.1.tgz: attempt #1
Feb 9 20:31:49.892798 ignition[1060]: INFO : files: createFilesystemsFiles: createFiles: op(5): GET result: OK
Feb 9 20:31:49.980859 ignition[1060]: DEBUG : files: createFilesystemsFiles: createFiles: op(5): file matches expected sum of: 4d0ed0abb5951b9cf83cba938ef84bdc5b681f4ac869da8143974f6a53a3ff30c666389fa462b9d14d30af09bf03f6cdf77598c572f8fb3ea00cecdda467a48d
Feb 9 20:31:50.007506 ignition[1060]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/opt/cni-plugins-linux-amd64-v1.1.1.tgz"
Feb 9 20:31:50.007506 ignition[1060]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/opt/crictl-v1.26.0-linux-amd64.tar.gz"
Feb 9 20:31:50.007506 ignition[1060]: INFO : files: createFilesystemsFiles: createFiles: op(6): GET https://github.com/kubernetes-sigs/cri-tools/releases/download/v1.26.0/crictl-v1.26.0-linux-amd64.tar.gz: attempt #1
Feb 9 20:31:50.426809 ignition[1060]: INFO : files: createFilesystemsFiles: createFiles: op(6): GET result: OK
Feb 9 20:31:50.480732 ignition[1060]: DEBUG : files: createFilesystemsFiles: createFiles: op(6): file matches expected sum of: a3a2c02a90b008686c20babaf272e703924db2a3e2a0d4e2a7c81d994cbc68c47458a4a354ecc243af095b390815c7f203348b9749351ae817bd52a522300449
Feb 9 20:31:50.505508 ignition[1060]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/opt/crictl-v1.26.0-linux-amd64.tar.gz"
Feb 9 20:31:50.505508 ignition[1060]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/opt/bin/kubectl"
Feb 9 20:31:50.505508 ignition[1060]: INFO : files: createFilesystemsFiles: createFiles: op(7): GET https://dl.k8s.io/release/v1.26.5/bin/linux/amd64/kubectl: attempt #1
Feb 9 20:31:50.749905 ignition[1060]: INFO : files: createFilesystemsFiles: createFiles: op(7): GET result: OK
Feb 9 20:31:50.913883 ignition[1060]: DEBUG : files: createFilesystemsFiles: createFiles: op(7): file matches expected sum of: 97840854134909d75a1a2563628cc4ba632067369ce7fc8a8a1e90a387d32dd7bfd73f4f5b5a82ef842088e7470692951eb7fc869c5f297dd740f855672ee628
Feb 9 20:31:50.939570 ignition[1060]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/opt/bin/kubectl"
Feb 9 20:31:50.939570 ignition[1060]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/opt/bin/kubelet"
Feb 9 20:31:50.939570 ignition[1060]: INFO : files: createFilesystemsFiles: createFiles: op(8): GET https://dl.k8s.io/release/v1.26.5/bin/linux/amd64/kubelet: attempt #1
Feb 9 20:31:51.073486 ignition[1060]: INFO : files: createFilesystemsFiles: createFiles: op(8): GET result: OK
Feb 9 20:32:01.400864 ignition[1060]: DEBUG : files: createFilesystemsFiles: createFiles: op(8): file matches expected sum of: 40daf2a9b9e666c14b10e627da931bd79978628b1f23ef6429c1cb4fcba261f86ccff440c0dbb0070ee760fe55772b4fd279c4582dfbb17fa30bc94b7f00126b
Feb 9 20:32:01.400864 ignition[1060]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/opt/bin/kubelet"
Feb 9 20:32:01.442643 ignition[1060]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing file "/sysroot/opt/bin/kubeadm"
Feb 9 20:32:01.442643 ignition[1060]: INFO : files: createFilesystemsFiles: createFiles: op(9): GET https://dl.k8s.io/release/v1.26.5/bin/linux/amd64/kubeadm: attempt #1
Feb 9 20:32:01.475550 ignition[1060]: INFO : files: createFilesystemsFiles: createFiles: op(9): GET result: OK
Feb 9 20:32:05.452574 ignition[1060]: DEBUG : files: createFilesystemsFiles: createFiles: op(9): file matches expected sum of: 1c324cd645a7bf93d19d24c87498d9a17878eb1cc927e2680200ffeab2f85051ddec47d85b79b8e774042dc6726299ad3d7caf52c060701f00deba30dc33f660
Feb 9 20:32:05.490398 ignition[1060]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing file "/sysroot/opt/bin/kubeadm"
Feb 9 20:32:05.490398 ignition[1060]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/etc/docker/daemon.json"
Feb 9 20:32:05.490398 ignition[1060]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/etc/docker/daemon.json"
Feb 9 20:32:05.490398 ignition[1060]: INFO : files: createFilesystemsFiles: createFiles: op(b): [started] writing file "/sysroot/home/core/install.sh"
Feb 9 20:32:05.490398 ignition[1060]: INFO : files: createFilesystemsFiles: createFiles: op(b): [finished] writing file "/sysroot/home/core/install.sh"
Feb 9 20:32:05.490398 ignition[1060]: INFO : files: createFilesystemsFiles: createFiles: op(c): [started] writing file "/sysroot/home/core/nginx.yaml"
Feb 9 20:32:05.490398 ignition[1060]: INFO : files: createFilesystemsFiles: createFiles: op(c): [finished] writing file "/sysroot/home/core/nginx.yaml"
Feb 9 20:32:05.490398 ignition[1060]: INFO : files: createFilesystemsFiles: createFiles: op(d): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Feb 9 20:32:05.490398 ignition[1060]: INFO : files: createFilesystemsFiles: createFiles: op(d): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Feb 9 20:32:05.490398 ignition[1060]: INFO : files: createFilesystemsFiles: createFiles: op(e): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Feb 9 20:32:05.490398 ignition[1060]: INFO : files: createFilesystemsFiles: createFiles: op(e): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Feb 9 20:32:05.490398 ignition[1060]: INFO : files: createFilesystemsFiles: createFiles: op(f): [started] writing file "/sysroot/etc/flatcar/update.conf"
Feb 9 20:32:05.490398 ignition[1060]: INFO : files: createFilesystemsFiles: createFiles: op(f): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Feb 9 20:32:05.490398 ignition[1060]: INFO : files: createFilesystemsFiles: createFiles: op(10): [started] writing file "/sysroot/etc/systemd/system/packet-phone-home.service"
Feb 9 20:32:05.490398 ignition[1060]: INFO : files: createFilesystemsFiles: createFiles: op(10): oem config not found in "/usr/share/oem", looking on oem partition
Feb 9 20:32:05.490398 ignition[1060]: INFO : files: createFilesystemsFiles: createFiles: op(10): op(11): [started] mounting "/dev/disk/by-label/OEM" at "/mnt/oem2691794273"
Feb 9 20:32:05.813576 kernel: BTRFS info: devid 1 device path /dev/sda6 changed to /dev/disk/by-label/OEM scanned by ignition (1079)
Feb 9 20:32:05.813593 kernel: audit: type=1130 audit(1707510725.726:40): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:32:05.726000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:32:05.813704 ignition[1060]: CRITICAL : files: createFilesystemsFiles: createFiles: op(10): op(11): [failed] mounting "/dev/disk/by-label/OEM" at "/mnt/oem2691794273": device or resource busy
Feb 9 20:32:05.813704 ignition[1060]: ERROR : files: createFilesystemsFiles: createFiles: op(10): failed to mount ext4 device "/dev/disk/by-label/OEM" at "/mnt/oem2691794273", trying btrfs: device or resource busy
Feb 9 20:32:05.813704 ignition[1060]: INFO : files: createFilesystemsFiles: createFiles: op(10): op(12): [started] mounting "/dev/disk/by-label/OEM" at "/mnt/oem2691794273"
Feb 9 20:32:05.813704 ignition[1060]: INFO : files: createFilesystemsFiles: createFiles: op(10): op(12): [finished] mounting "/dev/disk/by-label/OEM" at "/mnt/oem2691794273"
Feb 9 20:32:05.813704 ignition[1060]: INFO : files: createFilesystemsFiles: createFiles: op(10): op(13): [started] unmounting "/mnt/oem2691794273"
Feb 9 20:32:05.813704 ignition[1060]: INFO : files: createFilesystemsFiles: createFiles: op(10): op(13): [finished] unmounting "/mnt/oem2691794273"
Feb 9 20:32:05.813704 ignition[1060]: INFO : files: createFilesystemsFiles: createFiles: op(10): [finished] writing file "/sysroot/etc/systemd/system/packet-phone-home.service"
Feb 9 20:32:05.813704 ignition[1060]: INFO : files: op(14): [started] processing unit "coreos-metadata-sshkeys@.service"
Feb 9 20:32:05.813704 ignition[1060]: INFO : files: op(14): [finished] processing unit "coreos-metadata-sshkeys@.service"
Feb 9 20:32:05.813704 ignition[1060]: INFO : files: op(15): [started] processing unit "packet-phone-home.service"
Feb 9 20:32:05.813704 ignition[1060]: INFO : files: op(15): [finished] processing unit "packet-phone-home.service"
Feb 9 20:32:05.813704 ignition[1060]: INFO : files: op(16): [started] processing unit "containerd.service"
Feb 9 20:32:05.813704 ignition[1060]: INFO : files: op(16): op(17): [started] writing systemd drop-in "10-use-cgroupfs.conf" at "/sysroot/etc/systemd/system/containerd.service.d/10-use-cgroupfs.conf"
Feb 9 20:32:05.813704 ignition[1060]: INFO : files: op(16): op(17): [finished] writing systemd drop-in "10-use-cgroupfs.conf" at "/sysroot/etc/systemd/system/containerd.service.d/10-use-cgroupfs.conf"
Feb 9 20:32:05.813704 ignition[1060]: INFO : files: op(16): [finished] processing unit "containerd.service"
Feb 9 20:32:05.813704 ignition[1060]: INFO : files: op(18): [started] processing unit "prepare-cni-plugins.service"
Feb 9 20:32:05.813704 ignition[1060]: INFO : files: op(18): op(19): [started] writing unit "prepare-cni-plugins.service" at "/sysroot/etc/systemd/system/prepare-cni-plugins.service"
Feb 9 20:32:06.466774 kernel: audit: type=1130 audit(1707510725.850:41): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:32:06.466861 kernel: audit: type=1130 audit(1707510725.919:42): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:32:06.466906 kernel: audit: type=1131 audit(1707510725.919:43): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:32:06.466946 kernel: audit: type=1130 audit(1707510726.099:44): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:32:06.466986 kernel: audit: type=1131 audit(1707510726.099:45): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:32:06.467025 kernel: audit: type=1130 audit(1707510726.294:46): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:32:05.850000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:32:05.919000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:32:05.919000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:32:06.099000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:32:06.099000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:32:06.294000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:32:05.495156 systemd[1]: mnt-oem2691794273.mount: Deactivated successfully.
Feb 9 20:32:06.473000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:32:06.544267 ignition[1060]: INFO : files: op(18): op(19): [finished] writing unit "prepare-cni-plugins.service" at "/sysroot/etc/systemd/system/prepare-cni-plugins.service"
Feb 9 20:32:06.544267 ignition[1060]: INFO : files: op(18): [finished] processing unit "prepare-cni-plugins.service"
Feb 9 20:32:06.544267 ignition[1060]: INFO : files: op(1a): [started] processing unit "prepare-critools.service"
Feb 9 20:32:06.544267 ignition[1060]: INFO : files: op(1a): op(1b): [started] writing unit "prepare-critools.service" at "/sysroot/etc/systemd/system/prepare-critools.service"
Feb 9 20:32:06.544267 ignition[1060]: INFO : files: op(1a): op(1b): [finished] writing unit "prepare-critools.service" at "/sysroot/etc/systemd/system/prepare-critools.service"
Feb 9 20:32:06.544267 ignition[1060]: INFO : files: op(1a): [finished] processing unit "prepare-critools.service"
Feb 9 20:32:06.544267 ignition[1060]: INFO : files: op(1c): [started] processing unit "prepare-helm.service"
Feb 9 20:32:06.544267 ignition[1060]: INFO : files: op(1c): op(1d): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Feb 9 20:32:06.544267 ignition[1060]: INFO : files: op(1c): op(1d): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Feb 9 20:32:06.544267 ignition[1060]: INFO : files: op(1c): [finished] processing unit "prepare-helm.service"
Feb 9 20:32:06.544267 ignition[1060]: INFO : files: op(1e): [started] setting preset to enabled for "coreos-metadata-sshkeys@.service "
Feb 9 20:32:06.544267 ignition[1060]: INFO : files: op(1e): [finished] setting preset to enabled for "coreos-metadata-sshkeys@.service "
Feb 9 20:32:06.544267 ignition[1060]: INFO : files: op(1f): [started] setting preset to enabled for "packet-phone-home.service"
Feb 9 20:32:06.544267 ignition[1060]: INFO : files: op(1f): [finished] setting preset to enabled for "packet-phone-home.service"
Feb 9 20:32:06.544267 ignition[1060]: INFO : files: op(20): [started] setting preset to enabled for "prepare-cni-plugins.service"
Feb 9 20:32:06.544267 ignition[1060]: INFO : files: op(20): [finished] setting preset to enabled for "prepare-cni-plugins.service"
Feb 9 20:32:06.544267 ignition[1060]: INFO : files: op(21): [started] setting preset to enabled for "prepare-critools.service"
Feb 9 20:32:06.544267 ignition[1060]: INFO : files: op(21): [finished] setting preset to enabled for "prepare-critools.service"
Feb 9 20:32:06.544267 ignition[1060]: INFO : files: op(22): [started] setting preset to enabled for "prepare-helm.service"
Feb 9 20:32:07.056646 kernel: audit: type=1131 audit(1707510726.473:47): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:32:07.056837 kernel: audit: type=1131 audit(1707510726.800:48): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:32:07.056882 kernel: audit: type=1131 audit(1707510726.892:49): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:32:06.800000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:32:06.892000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:32:06.961000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:32:05.718848 systemd[1]: Finished ignition-files.service.
Feb 9 20:32:07.071065 ignition[1060]: INFO : files: op(22): [finished] setting preset to enabled for "prepare-helm.service"
Feb 9 20:32:07.071065 ignition[1060]: INFO : files: createResultFile: createFiles: op(23): [started] writing file "/sysroot/etc/.ignition-result.json"
Feb 9 20:32:07.071065 ignition[1060]: INFO : files: createResultFile: createFiles: op(23): [finished] writing file "/sysroot/etc/.ignition-result.json"
Feb 9 20:32:07.071065 ignition[1060]: INFO : files: files passed
Feb 9 20:32:07.071065 ignition[1060]: INFO : POST message to Packet Timeline
Feb 9 20:32:07.071065 ignition[1060]: INFO : GET https://metadata.packet.net/metadata: attempt #1
Feb 9 20:32:07.071065 ignition[1060]: INFO : GET result: OK
Feb 9 20:32:07.071065 ignition[1060]: INFO : Ignition finished successfully
Feb 9 20:32:07.079000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:32:07.100000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:32:07.125000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-metadata-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:32:07.186000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:32:05.733656 systemd[1]: Starting initrd-setup-root-after-ignition.service...
Feb 9 20:32:07.217000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:32:07.228758 initrd-setup-root-after-ignition[1095]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Feb 9 20:32:07.236000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:32:05.795613 systemd[1]: torcx-profile-populate.service was skipped because of an unmet condition check (ConditionPathExists=/sysroot/etc/torcx/next-profile).
Feb 9 20:32:07.257000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=iscsiuio comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:32:05.795936 systemd[1]: Starting ignition-quench.service...
Feb 9 20:32:07.282000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:32:05.820637 systemd[1]: Finished initrd-setup-root-after-ignition.service.
Feb 9 20:32:05.851776 systemd[1]: ignition-quench.service: Deactivated successfully.
Feb 9 20:32:05.851858 systemd[1]: Finished ignition-quench.service.
Feb 9 20:32:05.919597 systemd[1]: Reached target ignition-complete.target.
Feb 9 20:32:06.040931 systemd[1]: Starting initrd-parse-etc.service...
Feb 9 20:32:07.358000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:32:07.367567 ignition[1110]: INFO : Ignition 2.14.0
Feb 9 20:32:07.367567 ignition[1110]: INFO : Stage: umount
Feb 9 20:32:07.367567 ignition[1110]: INFO : reading system config file "/usr/lib/ignition/base.d/base.ign"
Feb 9 20:32:07.367567 ignition[1110]: DEBUG : parsing config with SHA512: 0131bd505bfe1b1215ca4ec9809701a3323bf448114294874f7249d8d300440bd742a7532f60673bfa0746c04de0bd5ca68d0fe9a8ecd59464b13a6401323cb4
Feb 9 20:32:07.367567 ignition[1110]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/packet"
Feb 9 20:32:07.367567 ignition[1110]: INFO : umount: umount passed
Feb 9 20:32:07.367567 ignition[1110]: INFO : POST message to Packet Timeline
Feb 9 20:32:07.367567 ignition[1110]: INFO : GET https://metadata.packet.net/metadata: attempt #1
Feb 9 20:32:07.367567 ignition[1110]: INFO : GET result: OK
Feb 9 20:32:07.375000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:32:07.389000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:32:07.389000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:32:07.389000 audit: BPF prog-id=6 op=UNLOAD
Feb 9 20:32:07.450000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:32:07.470000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:32:07.487000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:32:07.497000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:32:07.522000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:32:06.064174 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Feb 9 20:32:07.538000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:32:07.548689 ignition[1110]: INFO : Ignition finished successfully
Feb 9 20:32:07.555000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:32:06.064217 systemd[1]: Finished initrd-parse-etc.service.
Feb 9 20:32:07.571000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:32:06.100575 systemd[1]: Reached target initrd-fs.target.
Feb 9 20:32:06.222587 systemd[1]: Reached target initrd.target.
Feb 9 20:32:07.601000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:32:06.241584 systemd[1]: dracut-mount.service was skipped because no trigger condition checks were met.
Feb 9 20:32:06.241978 systemd[1]: Starting dracut-pre-pivot.service...
Feb 9 20:32:06.272704 systemd[1]: Finished dracut-pre-pivot.service.
Feb 9 20:32:07.651000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:32:06.295638 systemd[1]: Starting initrd-cleanup.service...
Feb 9 20:32:07.667000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:32:06.363281 systemd[1]: Stopped target nss-lookup.target.
Feb 9 20:32:07.682000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:32:06.408640 systemd[1]: Stopped target remote-cryptsetup.target.
Feb 9 20:32:06.427678 systemd[1]: Stopped target timers.target.
Feb 9 20:32:07.711000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:32:06.447991 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Feb 9 20:32:07.726000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:32:07.726000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:32:06.448376 systemd[1]: Stopped dracut-pre-pivot.service. Feb 9 20:32:06.475175 systemd[1]: Stopped target initrd.target. Feb 9 20:32:06.551572 systemd[1]: Stopped target basic.target. Feb 9 20:32:06.570713 systemd[1]: Stopped target ignition-complete.target. Feb 9 20:32:06.589708 systemd[1]: Stopped target ignition-diskful.target. Feb 9 20:32:06.616683 systemd[1]: Stopped target initrd-root-device.target. Feb 9 20:32:06.641940 systemd[1]: Stopped target remote-fs.target. Feb 9 20:32:06.667877 systemd[1]: Stopped target remote-fs-pre.target. Feb 9 20:32:07.795000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=network-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:32:06.686960 systemd[1]: Stopped target sysinit.target. Feb 9 20:32:06.706944 systemd[1]: Stopped target local-fs.target. Feb 9 20:32:06.732937 systemd[1]: Stopped target local-fs-pre.target. Feb 9 20:32:07.839000 audit: BPF prog-id=5 op=UNLOAD Feb 9 20:32:06.759033 systemd[1]: Stopped target swap.target. Feb 9 20:32:07.839000 audit: BPF prog-id=4 op=UNLOAD Feb 9 20:32:07.839000 audit: BPF prog-id=3 op=UNLOAD Feb 9 20:32:07.842000 audit: BPF prog-id=8 op=UNLOAD Feb 9 20:32:07.842000 audit: BPF prog-id=7 op=UNLOAD Feb 9 20:32:06.777926 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Feb 9 20:32:06.778280 systemd[1]: Stopped dracut-pre-mount.service. Feb 9 20:32:06.801253 systemd[1]: Stopped target cryptsetup.target. 
Feb 9 20:32:07.895586 iscsid[909]: iscsid shutting down. Feb 9 20:32:06.879547 systemd[1]: dracut-initqueue.service: Deactivated successfully. Feb 9 20:32:06.879618 systemd[1]: Stopped dracut-initqueue.service. Feb 9 20:32:06.892747 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Feb 9 20:32:06.892824 systemd[1]: Stopped ignition-fetch-offline.service. Feb 9 20:32:06.961680 systemd[1]: Stopped target paths.target. Feb 9 20:32:06.976699 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Feb 9 20:32:06.980564 systemd[1]: Stopped systemd-ask-password-console.path. Feb 9 20:32:07.003758 systemd[1]: Stopped target slices.target. Feb 9 20:32:07.016721 systemd[1]: Stopped target sockets.target. Feb 9 20:32:07.043719 systemd[1]: iscsid.socket: Deactivated successfully. Feb 9 20:32:07.043852 systemd[1]: Closed iscsid.socket. Feb 9 20:32:07.064107 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Feb 9 20:32:07.064492 systemd[1]: Stopped initrd-setup-root-after-ignition.service. Feb 9 20:32:07.081034 systemd[1]: ignition-files.service: Deactivated successfully. Feb 9 20:32:07.081397 systemd[1]: Stopped ignition-files.service. Feb 9 20:32:07.102028 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully. Feb 9 20:32:07.102400 systemd[1]: Stopped flatcar-metadata-hostname.service. Feb 9 20:32:07.128178 systemd[1]: Stopping ignition-mount.service... Feb 9 20:32:07.149556 systemd[1]: Stopping iscsiuio.service... Feb 9 20:32:07.157497 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Feb 9 20:32:07.157585 systemd[1]: Stopped kmod-static-nodes.service. Feb 9 20:32:07.188304 systemd[1]: Stopping sysroot-boot.service... Feb 9 20:32:07.202482 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Feb 9 20:32:07.202770 systemd[1]: Stopped systemd-udev-trigger.service. Feb 9 20:32:07.219054 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. 
Feb 9 20:32:07.219396 systemd[1]: Stopped dracut-pre-trigger.service. Feb 9 20:32:07.246414 systemd[1]: sysroot-boot.mount: Deactivated successfully. Feb 9 20:32:07.248151 systemd[1]: iscsiuio.service: Deactivated successfully. Feb 9 20:32:07.248402 systemd[1]: Stopped iscsiuio.service. Feb 9 20:32:07.259838 systemd[1]: sysroot-boot.service: Deactivated successfully. Feb 9 20:32:07.260137 systemd[1]: Stopped sysroot-boot.service. Feb 9 20:32:07.285075 systemd[1]: Stopped target network.target. Feb 9 20:32:07.298868 systemd[1]: iscsiuio.socket: Deactivated successfully. Feb 9 20:32:07.298967 systemd[1]: Closed iscsiuio.socket. Feb 9 20:32:07.313947 systemd[1]: Stopping systemd-networkd.service... Feb 9 20:32:07.319520 systemd-networkd[880]: enp1s0f0np0: DHCPv6 lease lost Feb 9 20:32:07.328544 systemd-networkd[880]: enp1s0f1np1: DHCPv6 lease lost Feb 9 20:32:07.896412 systemd-journald[267]: Received SIGTERM from PID 1 (n/a). Feb 9 20:32:07.894000 audit: BPF prog-id=9 op=UNLOAD Feb 9 20:32:07.329834 systemd[1]: Stopping systemd-resolved.service... Feb 9 20:32:07.344512 systemd[1]: systemd-resolved.service: Deactivated successfully. Feb 9 20:32:07.344739 systemd[1]: Stopped systemd-resolved.service. Feb 9 20:32:07.360385 systemd[1]: systemd-networkd.service: Deactivated successfully. Feb 9 20:32:07.360618 systemd[1]: Stopped systemd-networkd.service. Feb 9 20:32:07.375635 systemd[1]: initrd-cleanup.service: Deactivated successfully. Feb 9 20:32:07.375676 systemd[1]: Finished initrd-cleanup.service. Feb 9 20:32:07.392924 systemd[1]: systemd-networkd.socket: Deactivated successfully. Feb 9 20:32:07.393032 systemd[1]: Closed systemd-networkd.socket. Feb 9 20:32:07.407226 systemd[1]: Stopping network-cleanup.service... Feb 9 20:32:07.423550 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Feb 9 20:32:07.423690 systemd[1]: Stopped parse-ip-for-networkd.service. Feb 9 20:32:07.451812 systemd[1]: systemd-sysctl.service: Deactivated successfully. 
Feb 9 20:32:07.451967 systemd[1]: Stopped systemd-sysctl.service. Feb 9 20:32:07.472077 systemd[1]: systemd-modules-load.service: Deactivated successfully. Feb 9 20:32:07.472220 systemd[1]: Stopped systemd-modules-load.service. Feb 9 20:32:07.492452 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Feb 9 20:32:07.493750 systemd[1]: ignition-mount.service: Deactivated successfully. Feb 9 20:32:07.493979 systemd[1]: Stopped ignition-mount.service. Feb 9 20:32:07.498090 systemd[1]: ignition-disks.service: Deactivated successfully. Feb 9 20:32:07.498127 systemd[1]: Stopped ignition-disks.service. Feb 9 20:32:07.523512 systemd[1]: ignition-kargs.service: Deactivated successfully. Feb 9 20:32:07.523564 systemd[1]: Stopped ignition-kargs.service. Feb 9 20:32:07.539552 systemd[1]: ignition-setup.service: Deactivated successfully. Feb 9 20:32:07.539606 systemd[1]: Stopped ignition-setup.service. Feb 9 20:32:07.556502 systemd[1]: initrd-setup-root.service: Deactivated successfully. Feb 9 20:32:07.556541 systemd[1]: Stopped initrd-setup-root.service. Feb 9 20:32:07.572651 systemd[1]: Stopping systemd-udevd.service... Feb 9 20:32:07.586716 systemd[1]: systemd-udevd.service: Deactivated successfully. Feb 9 20:32:07.586812 systemd[1]: Stopped systemd-udevd.service. Feb 9 20:32:07.603591 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Feb 9 20:32:07.603640 systemd[1]: Closed systemd-udevd-control.socket. Feb 9 20:32:07.617689 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Feb 9 20:32:07.617758 systemd[1]: Closed systemd-udevd-kernel.socket. Feb 9 20:32:07.636680 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Feb 9 20:32:07.636808 systemd[1]: Stopped dracut-pre-udev.service. Feb 9 20:32:07.652745 systemd[1]: dracut-cmdline.service: Deactivated successfully. Feb 9 20:32:07.652871 systemd[1]: Stopped dracut-cmdline.service. 
Feb 9 20:32:07.668747 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Feb 9 20:32:07.668871 systemd[1]: Stopped dracut-cmdline-ask.service. Feb 9 20:32:07.685579 systemd[1]: Starting initrd-udevadm-cleanup-db.service... Feb 9 20:32:07.698522 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Feb 9 20:32:07.698547 systemd[1]: Stopped systemd-vconsole-setup.service. Feb 9 20:32:07.712690 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Feb 9 20:32:07.712735 systemd[1]: Finished initrd-udevadm-cleanup-db.service. Feb 9 20:32:07.783439 systemd[1]: network-cleanup.service: Deactivated successfully. Feb 9 20:32:07.783662 systemd[1]: Stopped network-cleanup.service. Feb 9 20:32:07.796977 systemd[1]: Reached target initrd-switch-root.target. Feb 9 20:32:07.814308 systemd[1]: Starting initrd-switch-root.service... Feb 9 20:32:07.832631 systemd[1]: Switching root. Feb 9 20:32:07.897355 systemd-journald[267]: Journal stopped Feb 9 20:32:11.722507 kernel: SELinux: Class mctp_socket not defined in policy. Feb 9 20:32:11.722521 kernel: SELinux: Class anon_inode not defined in policy. Feb 9 20:32:11.722530 kernel: SELinux: the above unknown classes and permissions will be allowed Feb 9 20:32:11.722536 kernel: SELinux: policy capability network_peer_controls=1 Feb 9 20:32:11.722541 kernel: SELinux: policy capability open_perms=1 Feb 9 20:32:11.722547 kernel: SELinux: policy capability extended_socket_class=1 Feb 9 20:32:11.722554 kernel: SELinux: policy capability always_check_network=0 Feb 9 20:32:11.722559 kernel: SELinux: policy capability cgroup_seclabel=1 Feb 9 20:32:11.722564 kernel: SELinux: policy capability nnp_nosuid_transition=1 Feb 9 20:32:11.722570 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Feb 9 20:32:11.722576 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Feb 9 20:32:11.722582 systemd[1]: Successfully loaded SELinux policy in 319.977ms. 
Feb 9 20:32:11.722590 systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 6.167ms. Feb 9 20:32:11.722596 systemd[1]: systemd 252 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL -ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE -TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified) Feb 9 20:32:11.722604 systemd[1]: Detected architecture x86-64. Feb 9 20:32:11.722611 systemd[1]: Detected first boot. Feb 9 20:32:11.722617 systemd[1]: Hostname set to . Feb 9 20:32:11.722623 systemd[1]: Initializing machine ID from random generator. Feb 9 20:32:11.722629 kernel: SELinux: Context system_u:object_r:container_file_t:s0:c1022,c1023 is not valid (left unmapped). Feb 9 20:32:11.722635 systemd[1]: Populated /etc with preset unit settings. Feb 9 20:32:11.722641 systemd[1]: /usr/lib/systemd/system/locksmithd.service:8: Unit uses CPUShares=; please use CPUWeight= instead. Support for CPUShares= will be removed soon. Feb 9 20:32:11.722649 systemd[1]: /usr/lib/systemd/system/locksmithd.service:9: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 9 20:32:11.722657 systemd[1]: /run/systemd/system/docker.socket:8: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Feb 9 20:32:11.722663 systemd[1]: Queued start job for default target multi-user.target. Feb 9 20:32:11.722670 systemd[1]: Created slice system-addon\x2dconfig.slice. Feb 9 20:32:11.722676 systemd[1]: Created slice system-addon\x2drun.slice. Feb 9 20:32:11.722683 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice. Feb 9 20:32:11.722690 systemd[1]: Created slice system-getty.slice. Feb 9 20:32:11.722696 systemd[1]: Created slice system-modprobe.slice. 
Feb 9 20:32:11.722703 systemd[1]: Created slice system-serial\x2dgetty.slice. Feb 9 20:32:11.722709 systemd[1]: Created slice system-system\x2dcloudinit.slice. Feb 9 20:32:11.722715 systemd[1]: Created slice system-systemd\x2dfsck.slice. Feb 9 20:32:11.722721 systemd[1]: Created slice user.slice. Feb 9 20:32:11.722727 systemd[1]: Started systemd-ask-password-console.path. Feb 9 20:32:11.722734 systemd[1]: Started systemd-ask-password-wall.path. Feb 9 20:32:11.722740 systemd[1]: Set up automount boot.automount. Feb 9 20:32:11.722747 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount. Feb 9 20:32:11.722754 systemd[1]: Reached target integritysetup.target. Feb 9 20:32:11.722760 systemd[1]: Reached target remote-cryptsetup.target. Feb 9 20:32:11.722766 systemd[1]: Reached target remote-fs.target. Feb 9 20:32:11.722774 systemd[1]: Reached target slices.target. Feb 9 20:32:11.722781 systemd[1]: Reached target swap.target. Feb 9 20:32:11.722787 systemd[1]: Reached target torcx.target. Feb 9 20:32:11.722794 systemd[1]: Reached target veritysetup.target. Feb 9 20:32:11.722802 systemd[1]: Listening on systemd-coredump.socket. Feb 9 20:32:11.722808 systemd[1]: Listening on systemd-initctl.socket. Feb 9 20:32:11.722815 kernel: kauditd_printk_skb: 48 callbacks suppressed Feb 9 20:32:11.722821 kernel: audit: type=1400 audit(1707510730.967:91): avc: denied { audit_read } for pid=1 comm="systemd" capability=37 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=1 Feb 9 20:32:11.722828 systemd[1]: Listening on systemd-journald-audit.socket. Feb 9 20:32:11.722834 kernel: audit: type=1335 audit(1707510730.967:92): pid=1 uid=0 auid=4294967295 tty=(none) ses=4294967295 subj=system_u:system_r:kernel_t:s0 comm="systemd" exe="/usr/lib/systemd/systemd" nl-mcgrp=1 op=connect res=1 Feb 9 20:32:11.722841 systemd[1]: Listening on systemd-journald-dev-log.socket. 
Feb 9 20:32:11.722847 systemd[1]: Listening on systemd-journald.socket. Feb 9 20:32:11.722855 systemd[1]: Listening on systemd-networkd.socket. Feb 9 20:32:11.722861 systemd[1]: Listening on systemd-udevd-control.socket. Feb 9 20:32:11.722868 systemd[1]: Listening on systemd-udevd-kernel.socket. Feb 9 20:32:11.722875 systemd[1]: Listening on systemd-userdbd.socket. Feb 9 20:32:11.722882 systemd[1]: Mounting dev-hugepages.mount... Feb 9 20:32:11.722889 systemd[1]: Mounting dev-mqueue.mount... Feb 9 20:32:11.722896 systemd[1]: Mounting media.mount... Feb 9 20:32:11.722902 systemd[1]: proc-xen.mount was skipped because of an unmet condition check (ConditionVirtualization=xen). Feb 9 20:32:11.722909 systemd[1]: Mounting sys-kernel-debug.mount... Feb 9 20:32:11.722916 systemd[1]: Mounting sys-kernel-tracing.mount... Feb 9 20:32:11.722922 systemd[1]: Mounting tmp.mount... Feb 9 20:32:11.722929 systemd[1]: Starting flatcar-tmpfiles.service... Feb 9 20:32:11.722935 systemd[1]: ignition-delete-config.service was skipped because no trigger condition checks were met. Feb 9 20:32:11.722943 systemd[1]: Starting kmod-static-nodes.service... Feb 9 20:32:11.722951 systemd[1]: Starting modprobe@configfs.service... Feb 9 20:32:11.722957 systemd[1]: Starting modprobe@dm_mod.service... Feb 9 20:32:11.722964 systemd[1]: Starting modprobe@drm.service... Feb 9 20:32:11.722971 systemd[1]: Starting modprobe@efi_pstore.service... Feb 9 20:32:11.722977 systemd[1]: Starting modprobe@fuse.service... Feb 9 20:32:11.722984 kernel: fuse: init (API version 7.34) Feb 9 20:32:11.722990 systemd[1]: Starting modprobe@loop.service... Feb 9 20:32:11.722996 kernel: loop: module loaded Feb 9 20:32:11.723003 systemd[1]: setup-nsswitch.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Feb 9 20:32:11.723011 systemd[1]: systemd-journald.service: unit configures an IP firewall, but the local system does not support BPF/cgroup firewalling. 
Feb 9 20:32:11.723017 systemd[1]: (This warning is only shown for the first unit using IP firewalling.) Feb 9 20:32:11.723024 systemd[1]: Starting systemd-journald.service... Feb 9 20:32:11.723030 systemd[1]: Starting systemd-modules-load.service... Feb 9 20:32:11.723037 kernel: audit: type=1305 audit(1707510731.719:93): op=set audit_enabled=1 old=1 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 res=1 Feb 9 20:32:11.723045 systemd-journald[1301]: Journal started Feb 9 20:32:11.723071 systemd-journald[1301]: Runtime Journal (/run/log/journal/f4163e565d704f1896ea75b46885f08f) is 8.0M, max 640.1M, 632.1M free. Feb 9 20:32:10.967000 audit[1]: AVC avc: denied { audit_read } for pid=1 comm="systemd" capability=37 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=1 Feb 9 20:32:10.967000 audit[1]: EVENT_LISTENER pid=1 uid=0 auid=4294967295 tty=(none) ses=4294967295 subj=system_u:system_r:kernel_t:s0 comm="systemd" exe="/usr/lib/systemd/systemd" nl-mcgrp=1 op=connect res=1 Feb 9 20:32:11.719000 audit: CONFIG_CHANGE op=set audit_enabled=1 old=1 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 res=1 Feb 9 20:32:11.719000 audit[1301]: SYSCALL arch=c000003e syscall=46 success=yes exit=60 a0=4 a1=7ffd1d6aa980 a2=4000 a3=7ffd1d6aaa1c items=0 ppid=1 pid=1301 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="systemd-journal" exe="/usr/lib/systemd/systemd-journald" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:32:11.719000 audit: PROCTITLE proctitle="/usr/lib/systemd/systemd-journald" Feb 9 20:32:11.771403 kernel: audit: type=1300 audit(1707510731.719:93): arch=c000003e syscall=46 success=yes exit=60 a0=4 a1=7ffd1d6aa980 a2=4000 a3=7ffd1d6aaa1c items=0 ppid=1 pid=1301 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="systemd-journal" exe="/usr/lib/systemd/systemd-journald" 
subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:32:11.771440 kernel: audit: type=1327 audit(1707510731.719:93): proctitle="/usr/lib/systemd/systemd-journald" Feb 9 20:32:11.885524 systemd[1]: Starting systemd-network-generator.service... Feb 9 20:32:11.912344 systemd[1]: Starting systemd-remount-fs.service... Feb 9 20:32:11.939396 systemd[1]: Starting systemd-udev-trigger.service... Feb 9 20:32:11.982382 systemd[1]: xenserver-pv-version.service was skipped because of an unmet condition check (ConditionVirtualization=xen). Feb 9 20:32:12.001370 systemd[1]: Started systemd-journald.service. Feb 9 20:32:12.009000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:32:12.010057 systemd[1]: Mounted dev-hugepages.mount. Feb 9 20:32:12.057524 kernel: audit: type=1130 audit(1707510732.009:94): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:32:12.064599 systemd[1]: Mounted dev-mqueue.mount. Feb 9 20:32:12.071582 systemd[1]: Mounted media.mount. Feb 9 20:32:12.078592 systemd[1]: Mounted sys-kernel-debug.mount. Feb 9 20:32:12.086600 systemd[1]: Mounted sys-kernel-tracing.mount. Feb 9 20:32:12.094560 systemd[1]: Mounted tmp.mount. Feb 9 20:32:12.101684 systemd[1]: Finished flatcar-tmpfiles.service. Feb 9 20:32:12.109000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=flatcar-tmpfiles comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:32:12.110700 systemd[1]: Finished kmod-static-nodes.service. 
Feb 9 20:32:12.158404 kernel: audit: type=1130 audit(1707510732.109:95): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=flatcar-tmpfiles comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:32:12.166000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:32:12.166650 systemd[1]: modprobe@configfs.service: Deactivated successfully. Feb 9 20:32:12.166728 systemd[1]: Finished modprobe@configfs.service. Feb 9 20:32:12.215366 kernel: audit: type=1130 audit(1707510732.166:96): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:32:12.223000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:32:12.223661 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Feb 9 20:32:12.223733 systemd[1]: Finished modprobe@dm_mod.service. Feb 9 20:32:12.223000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:32:12.274380 kernel: audit: type=1130 audit(1707510732.223:97): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 9 20:32:12.274409 kernel: audit: type=1131 audit(1707510732.223:98): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:32:12.334000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:32:12.334000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:32:12.334666 systemd[1]: modprobe@drm.service: Deactivated successfully. Feb 9 20:32:12.334739 systemd[1]: Finished modprobe@drm.service. Feb 9 20:32:12.344000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:32:12.344000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:32:12.344693 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Feb 9 20:32:12.344783 systemd[1]: Finished modprobe@efi_pstore.service. Feb 9 20:32:12.354000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:32:12.354000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? 
terminal=? res=success' Feb 9 20:32:12.354688 systemd[1]: modprobe@fuse.service: Deactivated successfully. Feb 9 20:32:12.354775 systemd[1]: Finished modprobe@fuse.service. Feb 9 20:32:12.364000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:32:12.364000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:32:12.364673 systemd[1]: modprobe@loop.service: Deactivated successfully. Feb 9 20:32:12.364771 systemd[1]: Finished modprobe@loop.service. Feb 9 20:32:12.374000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:32:12.374000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:32:12.374776 systemd[1]: Finished systemd-modules-load.service. Feb 9 20:32:12.383000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:32:12.384728 systemd[1]: Finished systemd-network-generator.service. Feb 9 20:32:12.393000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:32:12.394777 systemd[1]: Finished systemd-remount-fs.service. 
Feb 9 20:32:12.403000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-remount-fs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:32:12.404149 systemd[1]: Finished systemd-udev-trigger.service. Feb 9 20:32:12.411000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:32:12.413051 systemd[1]: Reached target network-pre.target. Feb 9 20:32:12.424046 systemd[1]: Mounting sys-fs-fuse-connections.mount... Feb 9 20:32:12.433038 systemd[1]: Mounting sys-kernel-config.mount... Feb 9 20:32:12.440542 systemd[1]: remount-root.service was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Feb 9 20:32:12.441692 systemd[1]: Starting systemd-hwdb-update.service... Feb 9 20:32:12.449096 systemd[1]: Starting systemd-journal-flush.service... Feb 9 20:32:12.452574 systemd-journald[1301]: Time spent on flushing to /var/log/journal/f4163e565d704f1896ea75b46885f08f is 15.132ms for 1554 entries. Feb 9 20:32:12.452574 systemd-journald[1301]: System Journal (/var/log/journal/f4163e565d704f1896ea75b46885f08f) is 8.0M, max 195.6M, 187.6M free. Feb 9 20:32:12.501596 systemd-journald[1301]: Received client request to flush runtime journal. Feb 9 20:32:12.465486 systemd[1]: systemd-pstore.service was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Feb 9 20:32:12.466029 systemd[1]: Starting systemd-random-seed.service... Feb 9 20:32:12.483461 systemd[1]: systemd-repart.service was skipped because no trigger condition checks were met. Feb 9 20:32:12.484087 systemd[1]: Starting systemd-sysctl.service... Feb 9 20:32:12.491256 systemd[1]: Starting systemd-sysusers.service... 
Feb 9 20:32:12.499022 systemd[1]: Starting systemd-udev-settle.service... Feb 9 20:32:12.507641 systemd[1]: Mounted sys-fs-fuse-connections.mount. Feb 9 20:32:12.516517 systemd[1]: Mounted sys-kernel-config.mount. Feb 9 20:32:12.525608 systemd[1]: Finished systemd-journal-flush.service. Feb 9 20:32:12.532000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-flush comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:32:12.533648 systemd[1]: Finished systemd-random-seed.service. Feb 9 20:32:12.540000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-random-seed comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:32:12.541571 systemd[1]: Finished systemd-sysctl.service. Feb 9 20:32:12.548000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:32:12.549590 systemd[1]: Finished systemd-sysusers.service. Feb 9 20:32:12.556000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysusers comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:32:12.558486 systemd[1]: Reached target first-boot-complete.target. Feb 9 20:32:12.567137 systemd[1]: Starting systemd-tmpfiles-setup-dev.service... Feb 9 20:32:12.575724 udevadm[1327]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation-early.service, lvm2-activation.service not to pull it in. Feb 9 20:32:12.584509 systemd[1]: Finished systemd-tmpfiles-setup-dev.service. 
Feb 9 20:32:12.591000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:32:12.742753 systemd[1]: Finished systemd-hwdb-update.service. Feb 9 20:32:12.752000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-hwdb-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:32:12.753287 systemd[1]: Starting systemd-udevd.service... Feb 9 20:32:12.764960 systemd-udevd[1336]: Using default interface naming scheme 'v252'. Feb 9 20:32:12.780712 systemd[1]: Started systemd-udevd.service. Feb 9 20:32:12.787000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:32:12.792036 systemd[1]: Found device dev-ttyS1.device. Feb 9 20:32:12.819453 systemd[1]: Starting systemd-networkd.service... Feb 9 20:32:12.821347 kernel: input: Sleep Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0E:00/input/input2 Feb 9 20:32:12.869072 kernel: ACPI: button: Sleep Button [SLPB] Feb 9 20:32:12.869129 kernel: BTRFS info: devid 1 device path /dev/disk/by-label/OEM changed to /dev/sda6 scanned by (udev-worker) (1392) Feb 9 20:32:12.869151 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3 Feb 9 20:32:12.883438 systemd[1]: dev-disk-by\x2dlabel-OEM.device was skipped because of an unmet condition check (ConditionPathExists=!/usr/.noupdate). Feb 9 20:32:12.884785 systemd[1]: Starting systemd-userdbd.service... 
Feb 9 20:32:12.868000 audit[1349]: AVC avc: denied { confidentiality } for pid=1349 comm="(udev-worker)" lockdown_reason="use of tracefs" scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=lockdown permissive=1 Feb 9 20:32:12.914356 kernel: IPMI message handler: version 39.2 Feb 9 20:32:12.914425 kernel: ACPI: button: Power Button [PWRF] Feb 9 20:32:12.934359 kernel: mousedev: PS/2 mouse device common for all mice Feb 9 20:32:12.868000 audit[1349]: SYSCALL arch=c000003e syscall=175 success=yes exit=0 a0=562ed0844b20 a1=4d8bc a2=7f1b4bb8dbc5 a3=5 items=42 ppid=1336 pid=1349 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="(udev-worker)" exe="/usr/bin/udevadm" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:32:12.868000 audit: CWD cwd="/" Feb 9 20:32:12.868000 audit: PATH item=0 name=(null) inode=45 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 9 20:32:12.868000 audit: PATH item=1 name=(null) inode=20754 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 9 20:32:12.868000 audit: PATH item=2 name=(null) inode=20754 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 9 20:32:12.868000 audit: PATH item=3 name=(null) inode=20755 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 9 20:32:12.868000 audit: PATH item=4 name=(null) inode=20754 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 9 20:32:12.868000 audit: PATH item=5 name=(null) inode=20756 dev=00:0b 
mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 9 20:32:12.868000 audit: PATH item=6 name=(null) inode=20754 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 9 20:32:12.868000 audit: PATH item=7 name=(null) inode=20757 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 9 20:32:12.868000 audit: PATH item=8 name=(null) inode=20757 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 9 20:32:12.868000 audit: PATH item=9 name=(null) inode=20758 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 9 20:32:12.868000 audit: PATH item=10 name=(null) inode=20757 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 9 20:32:12.868000 audit: PATH item=11 name=(null) inode=20759 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 9 20:32:12.868000 audit: PATH item=12 name=(null) inode=20757 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 9 20:32:12.868000 audit: PATH item=13 name=(null) inode=20760 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 9 20:32:12.868000 audit: PATH item=14 name=(null) inode=20757 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 
obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 9 20:32:12.868000 audit: PATH item=15 name=(null) inode=20761 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 9 20:32:12.868000 audit: PATH item=16 name=(null) inode=20757 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 9 20:32:12.868000 audit: PATH item=17 name=(null) inode=20762 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 9 20:32:12.868000 audit: PATH item=18 name=(null) inode=20754 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 9 20:32:12.868000 audit: PATH item=19 name=(null) inode=20763 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 9 20:32:12.868000 audit: PATH item=20 name=(null) inode=20763 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 9 20:32:12.868000 audit: PATH item=21 name=(null) inode=20764 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 9 20:32:12.868000 audit: PATH item=22 name=(null) inode=20763 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 9 20:32:12.868000 audit: PATH item=23 name=(null) inode=20765 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE 
cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 9 20:32:12.868000 audit: PATH item=24 name=(null) inode=20763 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 9 20:32:12.868000 audit: PATH item=25 name=(null) inode=20766 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 9 20:32:12.868000 audit: PATH item=26 name=(null) inode=20763 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 9 20:32:12.868000 audit: PATH item=27 name=(null) inode=20767 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 9 20:32:12.868000 audit: PATH item=28 name=(null) inode=20763 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 9 20:32:12.868000 audit: PATH item=29 name=(null) inode=20768 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 9 20:32:12.868000 audit: PATH item=30 name=(null) inode=20754 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 9 20:32:12.868000 audit: PATH item=31 name=(null) inode=20769 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 9 20:32:12.868000 audit: PATH item=32 name=(null) inode=20769 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 
Feb 9 20:32:12.868000 audit: PATH item=33 name=(null) inode=20770 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 9 20:32:12.868000 audit: PATH item=34 name=(null) inode=20769 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 9 20:32:12.868000 audit: PATH item=35 name=(null) inode=20771 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 9 20:32:12.868000 audit: PATH item=36 name=(null) inode=20769 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 9 20:32:12.868000 audit: PATH item=37 name=(null) inode=20772 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 9 20:32:12.868000 audit: PATH item=38 name=(null) inode=20769 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 9 20:32:12.868000 audit: PATH item=39 name=(null) inode=20773 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 9 20:32:12.868000 audit: PATH item=40 name=(null) inode=20769 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 9 20:32:12.868000 audit: PATH item=41 name=(null) inode=20774 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 9 20:32:12.868000 audit: PROCTITLE 
proctitle="(udev-worker)" Feb 9 20:32:13.010827 kernel: i801_smbus 0000:00:1f.4: SPD Write Disable is set Feb 9 20:32:13.011017 kernel: i801_smbus 0000:00:1f.4: SMBus using PCI interrupt Feb 9 20:32:13.034346 kernel: i2c i2c-0: 1/4 memory slots populated (from DMI) Feb 9 20:32:13.043439 systemd[1]: Started systemd-userdbd.service. Feb 9 20:32:13.054000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-userdbd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:32:13.064346 kernel: ipmi device interface Feb 9 20:32:13.064382 kernel: iTCO_vendor_support: vendor-support=0 Feb 9 20:32:13.064414 kernel: mei_me 0000:00:16.4: Device doesn't have valid ME Interface Feb 9 20:32:13.064526 kernel: mei_me 0000:00:16.0: Device doesn't have valid ME Interface Feb 9 20:32:13.173572 kernel: ipmi_si: IPMI System Interface driver Feb 9 20:32:13.173629 kernel: ipmi_si dmi-ipmi-si.0: ipmi_platform: probing via SMBIOS Feb 9 20:32:13.173730 kernel: ipmi_platform: ipmi_si: SMBIOS: io 0xca2 regsize 1 spacing 1 irq 0 Feb 9 20:32:13.196766 kernel: ipmi_si: Adding SMBIOS-specified kcs state machine Feb 9 20:32:13.239755 kernel: ipmi_si IPI0001:00: ipmi_platform: probing via ACPI Feb 9 20:32:13.239888 kernel: ipmi_si IPI0001:00: ipmi_platform: [io 0x0ca2] regsize 1 spacing 1 irq 0 Feb 9 20:32:13.315377 kernel: iTCO_wdt iTCO_wdt: Found a Intel PCH TCO device (Version=6, TCOBASE=0x0400) Feb 9 20:32:13.315485 kernel: ipmi_si dmi-ipmi-si.0: Removing SMBIOS-specified kcs state machine in favor of ACPI Feb 9 20:32:13.315556 kernel: iTCO_wdt iTCO_wdt: initialized. 
heartbeat=30 sec (nowayout=0) Feb 9 20:32:13.315631 kernel: ipmi_si: Adding ACPI-specified kcs state machine Feb 9 20:32:13.315646 kernel: ipmi_si: Trying ACPI-specified kcs state machine at i/o address 0xca2, slave address 0x20, irq 0 Feb 9 20:32:13.452687 kernel: intel_rapl_common: Found RAPL domain package Feb 9 20:32:13.452735 kernel: ipmi_si IPI0001:00: The BMC does not support clearing the recv irq bit, compensating, but the BMC needs to be fixed. Feb 9 20:32:13.452840 kernel: intel_rapl_common: Found RAPL domain core Feb 9 20:32:13.497558 kernel: ipmi_si IPI0001:00: IPMI message handler: Found new BMC (man_id: 0x002a7c, prod_id: 0x1b0f, dev_id: 0x20) Feb 9 20:32:13.497672 kernel: intel_rapl_common: Found RAPL domain dram Feb 9 20:32:13.541763 systemd-networkd[1418]: bond0: netdev ready Feb 9 20:32:13.544149 systemd-networkd[1418]: lo: Link UP Feb 9 20:32:13.544151 systemd-networkd[1418]: lo: Gained carrier Feb 9 20:32:13.544723 systemd-networkd[1418]: Enumeration completed Feb 9 20:32:13.544850 systemd[1]: Started systemd-networkd.service. Feb 9 20:32:13.545042 systemd-networkd[1418]: bond0: Configuring with /etc/systemd/network/05-bond0.network. Feb 9 20:32:13.552000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:32:13.556720 systemd-networkd[1418]: enp1s0f1np1: Configuring with /etc/systemd/network/10-b8:59:9f:de:84:f9.network. Feb 9 20:32:13.587343 kernel: ipmi_si IPI0001:00: IPMI kcs interface initialized Feb 9 20:32:13.609345 kernel: ipmi_ssif: IPMI SSIF Interface driver Feb 9 20:32:13.611604 systemd[1]: Finished systemd-udev-settle.service. Feb 9 20:32:13.618000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-settle comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 9 20:32:13.620156 systemd[1]: Starting lvm2-activation-early.service... Feb 9 20:32:13.636001 lvm[1445]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Feb 9 20:32:13.671787 systemd[1]: Finished lvm2-activation-early.service. Feb 9 20:32:13.679000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=lvm2-activation-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:32:13.680511 systemd[1]: Reached target cryptsetup.target. Feb 9 20:32:13.689048 systemd[1]: Starting lvm2-activation.service... Feb 9 20:32:13.691164 lvm[1447]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Feb 9 20:32:13.725746 systemd[1]: Finished lvm2-activation.service. Feb 9 20:32:13.732000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=lvm2-activation comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:32:13.733521 systemd[1]: Reached target local-fs-pre.target. Feb 9 20:32:13.741425 systemd[1]: var-lib-machines.mount was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Feb 9 20:32:13.741439 systemd[1]: Reached target local-fs.target. Feb 9 20:32:13.749413 systemd[1]: Reached target machines.target. Feb 9 20:32:13.758077 systemd[1]: Starting ldconfig.service... Feb 9 20:32:13.764844 systemd[1]: systemd-binfmt.service was skipped because no trigger condition checks were met. Feb 9 20:32:13.764866 systemd[1]: systemd-boot-system-token.service was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/LoaderFeatures-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f). Feb 9 20:32:13.765450 systemd[1]: Starting systemd-boot-update.service... Feb 9 20:32:13.772875 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service... 
Feb 9 20:32:13.782976 systemd[1]: Starting systemd-machine-id-commit.service... Feb 9 20:32:13.783125 systemd[1]: systemd-sysext.service was skipped because no trigger condition checks were met. Feb 9 20:32:13.783150 systemd[1]: ensure-sysext.service was skipped because no trigger condition checks were met. Feb 9 20:32:13.783978 systemd[1]: Starting systemd-tmpfiles-setup.service... Feb 9 20:32:13.784163 systemd[1]: boot.automount: Got automount request for /boot, triggered by 1450 (bootctl) Feb 9 20:32:13.784978 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-EFI\x2dSYSTEM.service... Feb 9 20:32:13.801342 systemd-tmpfiles[1454]: /usr/lib/tmpfiles.d/legacy.conf:13: Duplicate line for path "/run/lock", ignoring. Feb 9 20:32:13.803731 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service. Feb 9 20:32:13.802000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck@dev-disk-by\x2dlabel-OEM comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:32:13.820589 systemd-tmpfiles[1454]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Feb 9 20:32:13.830955 systemd-tmpfiles[1454]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Feb 9 20:32:13.969429 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Feb 9 20:32:13.969800 systemd[1]: Finished systemd-machine-id-commit.service. Feb 9 20:32:13.968000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-machine-id-commit comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 9 20:32:14.012841 systemd-fsck[1459]: fsck.fat 4.2 (2021-01-31) Feb 9 20:32:14.012841 systemd-fsck[1459]: /dev/sda1: 789 files, 115339/258078 clusters Feb 9 20:32:14.013579 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-EFI\x2dSYSTEM.service. Feb 9 20:32:14.022000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck@dev-disk-by\x2dlabel-EFI\x2dSYSTEM comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:32:14.024123 systemd[1]: Mounting boot.mount... Feb 9 20:32:14.036679 systemd[1]: Mounted boot.mount. Feb 9 20:32:14.055360 systemd[1]: Finished systemd-boot-update.service. Feb 9 20:32:14.062000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-boot-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:32:14.087516 systemd[1]: Finished systemd-tmpfiles-setup.service. Feb 9 20:32:14.094000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:32:14.096165 systemd[1]: Starting audit-rules.service... Feb 9 20:32:14.104067 systemd[1]: Starting clean-ca-certificates.service... 
Feb 9 20:32:14.111843 augenrules[1485]: No rules Feb 9 20:32:14.110000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=add_rule key=(null) list=5 res=1 Feb 9 20:32:14.110000 audit[1485]: SYSCALL arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7ffc834e79a0 a2=420 a3=0 items=0 ppid=1468 pid=1485 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/sbin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:32:14.110000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Feb 9 20:32:14.113085 systemd[1]: Starting systemd-journal-catalog-update.service... Feb 9 20:32:14.122221 systemd[1]: Starting systemd-resolved.service... Feb 9 20:32:14.130204 systemd[1]: Starting systemd-timesyncd.service... Feb 9 20:32:14.137996 systemd[1]: Starting systemd-update-utmp.service... Feb 9 20:32:14.144757 systemd[1]: Finished audit-rules.service. Feb 9 20:32:14.151593 systemd[1]: Finished clean-ca-certificates.service. Feb 9 20:32:14.159603 systemd[1]: Finished systemd-journal-catalog-update.service. Feb 9 20:32:14.171126 systemd[1]: Finished systemd-update-utmp.service. Feb 9 20:32:14.179555 systemd[1]: update-ca-certificates.service was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Feb 9 20:32:14.204074 ldconfig[1449]: /sbin/ldconfig: /lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Feb 9 20:32:14.204209 systemd[1]: Started systemd-timesyncd.service. Feb 9 20:32:14.206368 systemd-resolved[1492]: Positive Trust Anchors: Feb 9 20:32:14.206373 systemd-resolved[1492]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Feb 9 20:32:14.206391 systemd-resolved[1492]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa corp home internal intranet lan local private test Feb 9 20:32:14.210631 systemd-resolved[1492]: Using system hostname 'ci-3510.3.2-a-45f40c263c'. Feb 9 20:32:14.212658 systemd[1]: Finished ldconfig.service. Feb 9 20:32:14.219489 systemd[1]: Reached target time-set.target. Feb 9 20:32:14.228113 systemd[1]: Starting systemd-update-done.service... Feb 9 20:32:14.234608 systemd[1]: Finished systemd-update-done.service. Feb 9 20:32:14.541524 kernel: mlx5_core 0000:01:00.1 enp1s0f1np1: Link up Feb 9 20:32:14.568372 kernel: bond0: (slave enp1s0f1np1): Enslaving as a backup interface with an up link Feb 9 20:32:14.569934 systemd-networkd[1418]: enp1s0f0np0: Configuring with /etc/systemd/network/10-b8:59:9f:de:84:f8.network. Feb 9 20:32:14.570565 systemd[1]: Started systemd-resolved.service. Feb 9 20:32:14.578570 systemd[1]: Reached target network.target. Feb 9 20:32:14.594349 systemd[1]: Reached target nss-lookup.target. Feb 9 20:32:14.609341 kernel: bond0: Warning: No 802.3ad response from the link partner for any adapters in the bond Feb 9 20:32:14.617455 systemd[1]: Reached target sysinit.target. Feb 9 20:32:14.625551 systemd[1]: Started motdgen.path. Feb 9 20:32:14.632434 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path. Feb 9 20:32:14.642587 systemd[1]: Started logrotate.timer. Feb 9 20:32:14.649450 systemd[1]: Started mdadm.timer. Feb 9 20:32:14.656413 systemd[1]: Started systemd-tmpfiles-clean.timer. 
Feb 9 20:32:14.664415 systemd[1]: update-engine-stub.timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Feb 9 20:32:14.664429 systemd[1]: Reached target paths.target. Feb 9 20:32:14.671513 systemd[1]: Reached target timers.target. Feb 9 20:32:14.679664 systemd[1]: Listening on dbus.socket. Feb 9 20:32:14.688195 systemd[1]: Starting docker.socket... Feb 9 20:32:14.697460 systemd[1]: Listening on sshd.socket. Feb 9 20:32:14.704517 systemd[1]: systemd-pcrphase-sysinit.service was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f). Feb 9 20:32:14.704834 systemd[1]: Listening on docker.socket. Feb 9 20:32:14.712538 systemd[1]: Reached target sockets.target. Feb 9 20:32:14.736509 kernel: bond0: Warning: No 802.3ad response from the link partner for any adapters in the bond Feb 9 20:32:14.745115 systemd[1]: Reached target basic.target. Feb 9 20:32:14.752585 systemd[1]: System is tainted: cgroupsv1 Feb 9 20:32:14.752610 systemd[1]: addon-config@usr-share-oem.service was skipped because no trigger condition checks were met. Feb 9 20:32:14.752623 systemd[1]: addon-run@usr-share-oem.service was skipped because no trigger condition checks were met. Feb 9 20:32:14.753167 systemd[1]: Starting containerd.service... Feb 9 20:32:14.760985 systemd[1]: Starting coreos-metadata-sshkeys@core.service... Feb 9 20:32:14.770054 systemd[1]: Starting coreos-metadata.service... Feb 9 20:32:14.776951 systemd[1]: Starting dbus.service... Feb 9 20:32:14.783037 systemd[1]: Starting enable-oem-cloudinit.service... Feb 9 20:32:14.787855 jq[1512]: false Feb 9 20:32:14.789995 systemd[1]: Starting extend-filesystems.service... 
Feb 9 20:32:14.790312 coreos-metadata[1505]: Feb 09 20:32:14.790 INFO Fetching https://metadata.packet.net/metadata: Attempt #1 Feb 9 20:32:14.794188 coreos-metadata[1505]: Feb 09 20:32:14.794 INFO Failed to fetch: error sending request for url (https://metadata.packet.net/metadata): error trying to connect: dns error: failed to lookup address information: Temporary failure in name resolution Feb 9 20:32:14.795794 dbus-daemon[1511]: [system] SELinux support is enabled Feb 9 20:32:14.797361 extend-filesystems[1514]: Found sda Feb 9 20:32:14.884225 kernel: mlx5_core 0000:01:00.0 enp1s0f0np0: Link up Feb 9 20:32:14.884334 kernel: EXT4-fs (sda9): resizing filesystem from 553472 to 116605649 blocks Feb 9 20:32:14.884355 kernel: bond0: (slave enp1s0f0np0): Enslaving as a backup interface with an up link Feb 9 20:32:14.884398 kernel: IPv6: ADDRCONF(NETDEV_CHANGE): bond0: link becomes ready Feb 9 20:32:14.884432 kernel: bond0: (slave enp1s0f1np1): link status definitely up, 25000 Mbps full duplex Feb 9 20:32:14.871563 systemd[1]: flatcar-setup-environment.service was skipped because of an unmet condition check (ConditionPathExists=/usr/share/oem/bin/flatcar-setup-environment). 
Feb 9 20:32:14.884505 coreos-metadata[1508]: Feb 09 20:32:14.802 INFO Fetching https://metadata.packet.net/metadata: Attempt #1 Feb 9 20:32:14.884505 coreos-metadata[1508]: Feb 09 20:32:14.803 INFO Failed to fetch: error sending request for url (https://metadata.packet.net/metadata): error trying to connect: dns error: failed to lookup address information: Temporary failure in name resolution Feb 9 20:32:14.884642 extend-filesystems[1514]: Found sda1 Feb 9 20:32:14.884642 extend-filesystems[1514]: Found sda2 Feb 9 20:32:14.884642 extend-filesystems[1514]: Found sda3 Feb 9 20:32:14.884642 extend-filesystems[1514]: Found usr Feb 9 20:32:14.884642 extend-filesystems[1514]: Found sda4 Feb 9 20:32:14.884642 extend-filesystems[1514]: Found sda6 Feb 9 20:32:14.884642 extend-filesystems[1514]: Found sda7 Feb 9 20:32:14.884642 extend-filesystems[1514]: Found sda9 Feb 9 20:32:14.884642 extend-filesystems[1514]: Checking size of /dev/sda9 Feb 9 20:32:14.884642 extend-filesystems[1514]: Resized partition /dev/sda9 Feb 9 20:32:15.037401 kernel: bond0: active interface up! Feb 9 20:32:15.037423 kernel: bond0: (slave enp1s0f0np0): link status definitely up, 25000 Mbps full duplex Feb 9 20:32:14.872289 systemd[1]: Starting motdgen.service... Feb 9 20:32:15.037531 extend-filesystems[1522]: resize2fs 1.46.5 (30-Dec-2021) Feb 9 20:32:15.080575 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 9 20:32:15.080596 kernel: bond0: (slave enp1s0f1np1): invalid new link 1 on slave Feb 9 20:32:14.873018 systemd-networkd[1418]: bond0: Link UP Feb 9 20:32:14.873282 systemd-networkd[1418]: enp1s0f1np1: Link UP Feb 9 20:32:14.873472 systemd-networkd[1418]: enp1s0f1np1: Gained carrier Feb 9 20:32:14.874435 systemd-networkd[1418]: enp1s0f1np1: Reconfiguring with /etc/systemd/network/10-b8:59:9f:de:84:f8.network. Feb 9 20:32:14.914606 systemd[1]: Starting prepare-cni-plugins.service... Feb 9 20:32:14.943066 systemd[1]: Starting prepare-critools.service... 
Feb 9 20:32:14.963983 systemd[1]: Starting prepare-helm.service... Feb 9 20:32:14.982043 systemd[1]: Starting ssh-key-proc-cmdline.service... Feb 9 20:32:15.001185 systemd[1]: Starting sshd-keygen.service... Feb 9 20:32:15.016247 systemd[1]: Starting systemd-logind.service... Feb 9 20:32:15.029446 systemd[1]: systemd-pcrphase.service was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f). Feb 9 20:32:15.030548 systemd[1]: Starting tcsd.service... Feb 9 20:32:15.031208 systemd-networkd[1418]: enp1s0f0np0: Link UP Feb 9 20:32:15.031392 systemd-networkd[1418]: bond0: Gained carrier Feb 9 20:32:15.031508 systemd-networkd[1418]: enp1s0f0np0: Gained carrier Feb 9 20:32:15.031511 systemd-timesyncd[1494]: Network configuration changed, trying to establish connection. Feb 9 20:32:15.051764 systemd[1]: Starting update-engine.service... Feb 9 20:32:15.064512 systemd-timesyncd[1494]: Network configuration changed, trying to establish connection. Feb 9 20:32:15.064552 systemd-timesyncd[1494]: Network configuration changed, trying to establish connection. Feb 9 20:32:15.064798 systemd-networkd[1418]: enp1s0f1np1: Link DOWN Feb 9 20:32:15.064800 systemd-networkd[1418]: enp1s0f1np1: Lost carrier Feb 9 20:32:15.081186 systemd-logind[1548]: Watching system buttons on /dev/input/event3 (Power Button) Feb 9 20:32:15.081196 systemd-logind[1548]: Watching system buttons on /dev/input/event2 (Sleep Button) Feb 9 20:32:15.081205 systemd-logind[1548]: Watching system buttons on /dev/input/event0 (HID 0557:2419) Feb 9 20:32:15.081302 systemd-logind[1548]: New seat seat0. Feb 9 20:32:15.092554 systemd-timesyncd[1494]: Network configuration changed, trying to establish connection. Feb 9 20:32:15.092751 systemd-timesyncd[1494]: Network configuration changed, trying to establish connection. 
Feb 9 20:32:15.094844 update_engine[1551]: I0209 20:32:15.094372 1551 main.cc:92] Flatcar Update Engine starting Feb 9 20:32:15.096086 systemd[1]: Starting update-ssh-keys-after-ignition.service... Feb 9 20:32:15.097600 update_engine[1551]: I0209 20:32:15.097561 1551 update_check_scheduler.cc:74] Next update check in 11m38s Feb 9 20:32:15.097740 jq[1552]: true Feb 9 20:32:15.104731 systemd[1]: Started dbus.service. Feb 9 20:32:15.113742 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Feb 9 20:32:15.113879 systemd[1]: Condition check resulted in enable-oem-cloudinit.service being skipped. Feb 9 20:32:15.114052 systemd[1]: motdgen.service: Deactivated successfully. Feb 9 20:32:15.114157 systemd[1]: Finished motdgen.service. Feb 9 20:32:15.121470 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Feb 9 20:32:15.121586 systemd[1]: Finished ssh-key-proc-cmdline.service. Feb 9 20:32:15.125836 tar[1556]: ./ Feb 9 20:32:15.125836 tar[1556]: ./macvlan Feb 9 20:32:15.132372 jq[1562]: true Feb 9 20:32:15.133999 tar[1558]: linux-amd64/helm Feb 9 20:32:15.134457 dbus-daemon[1511]: [system] Successfully activated service 'org.freedesktop.systemd1' Feb 9 20:32:15.135906 tar[1557]: crictl Feb 9 20:32:15.137625 systemd[1]: tcsd.service: Skipped due to 'exec-condition'. Feb 9 20:32:15.137839 systemd[1]: Condition check resulted in tcsd.service being skipped. Feb 9 20:32:15.142847 env[1563]: time="2024-02-09T20:32:15.142816991Z" level=info msg="starting containerd" revision=92b3a9d6f1b3bcc6dc74875cfdea653fe39f09c2 version=1.6.16 Feb 9 20:32:15.146459 systemd[1]: Started update-engine.service. Feb 9 20:32:15.148678 tar[1556]: ./static Feb 9 20:32:15.151770 env[1563]: time="2024-02-09T20:32:15.151724601Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." 
type=io.containerd.content.v1 Feb 9 20:32:15.153440 env[1563]: time="2024-02-09T20:32:15.153385082Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1 Feb 9 20:32:15.154059 env[1563]: time="2024-02-09T20:32:15.154016637Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/5.15.148-flatcar\\n\"): skip plugin" type=io.containerd.snapshotter.v1 Feb 9 20:32:15.154059 env[1563]: time="2024-02-09T20:32:15.154032052Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1 Feb 9 20:32:15.154216 env[1563]: time="2024-02-09T20:32:15.154201491Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Feb 9 20:32:15.154244 env[1563]: time="2024-02-09T20:32:15.154215531Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1 Feb 9 20:32:15.154244 env[1563]: time="2024-02-09T20:32:15.154223781Z" level=warning msg="failed to load plugin io.containerd.snapshotter.v1.devmapper" error="devmapper not configured" Feb 9 20:32:15.154244 env[1563]: time="2024-02-09T20:32:15.154229360Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1 Feb 9 20:32:15.154453 systemd[1]: Started systemd-logind.service. Feb 9 20:32:15.155124 env[1563]: time="2024-02-09T20:32:15.155115073Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." 
type=io.containerd.snapshotter.v1 Feb 9 20:32:15.155260 env[1563]: time="2024-02-09T20:32:15.155251824Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1 Feb 9 20:32:15.155390 env[1563]: time="2024-02-09T20:32:15.155342481Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Feb 9 20:32:15.155390 env[1563]: time="2024-02-09T20:32:15.155353977Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1 Feb 9 20:32:15.156977 env[1563]: time="2024-02-09T20:32:15.156961414Z" level=warning msg="could not use snapshotter devmapper in metadata plugin" error="devmapper not configured" Feb 9 20:32:15.156977 env[1563]: time="2024-02-09T20:32:15.156969923Z" level=info msg="metadata content store policy set" policy=shared Feb 9 20:32:15.164552 systemd[1]: Started locksmithd.service. Feb 9 20:32:15.166728 bash[1591]: Updated "/home/core/.ssh/authorized_keys" Feb 9 20:32:15.171517 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Feb 9 20:32:15.171648 systemd[1]: Reached target system-config.target. Feb 9 20:32:15.172613 tar[1556]: ./vlan Feb 9 20:32:15.173898 env[1563]: time="2024-02-09T20:32:15.173844906Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1 Feb 9 20:32:15.173898 env[1563]: time="2024-02-09T20:32:15.173870416Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1 Feb 9 20:32:15.173898 env[1563]: time="2024-02-09T20:32:15.173880578Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." 
type=io.containerd.gc.v1 Feb 9 20:32:15.173962 env[1563]: time="2024-02-09T20:32:15.173908465Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1 Feb 9 20:32:15.173962 env[1563]: time="2024-02-09T20:32:15.173920382Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1 Feb 9 20:32:15.173962 env[1563]: time="2024-02-09T20:32:15.173928766Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1 Feb 9 20:32:15.173962 env[1563]: time="2024-02-09T20:32:15.173935685Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1 Feb 9 20:32:15.173962 env[1563]: time="2024-02-09T20:32:15.173943710Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1 Feb 9 20:32:15.173962 env[1563]: time="2024-02-09T20:32:15.173951060Z" level=info msg="loading plugin \"io.containerd.service.v1.leases-service\"..." type=io.containerd.service.v1 Feb 9 20:32:15.173962 env[1563]: time="2024-02-09T20:32:15.173958571Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1 Feb 9 20:32:15.174066 env[1563]: time="2024-02-09T20:32:15.173970236Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1 Feb 9 20:32:15.174066 env[1563]: time="2024-02-09T20:32:15.173982928Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1 Feb 9 20:32:15.174066 env[1563]: time="2024-02-09T20:32:15.174041037Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2 Feb 9 20:32:15.174112 env[1563]: time="2024-02-09T20:32:15.174090632Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." 
type=io.containerd.monitor.v1 Feb 9 20:32:15.174275 env[1563]: time="2024-02-09T20:32:15.174267934Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1 Feb 9 20:32:15.174295 env[1563]: time="2024-02-09T20:32:15.174282904Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1 Feb 9 20:32:15.174312 env[1563]: time="2024-02-09T20:32:15.174295070Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1 Feb 9 20:32:15.174328 env[1563]: time="2024-02-09T20:32:15.174321211Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1 Feb 9 20:32:15.174351 env[1563]: time="2024-02-09T20:32:15.174328727Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1 Feb 9 20:32:15.174351 env[1563]: time="2024-02-09T20:32:15.174336053Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1 Feb 9 20:32:15.174351 env[1563]: time="2024-02-09T20:32:15.174348478Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1 Feb 9 20:32:15.174396 env[1563]: time="2024-02-09T20:32:15.174355812Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1 Feb 9 20:32:15.174396 env[1563]: time="2024-02-09T20:32:15.174362578Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1 Feb 9 20:32:15.174396 env[1563]: time="2024-02-09T20:32:15.174368650Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1 Feb 9 20:32:15.174396 env[1563]: time="2024-02-09T20:32:15.174374665Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." 
type=io.containerd.grpc.v1 Feb 9 20:32:15.174396 env[1563]: time="2024-02-09T20:32:15.174381406Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1 Feb 9 20:32:15.174469 env[1563]: time="2024-02-09T20:32:15.174442346Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1 Feb 9 20:32:15.174469 env[1563]: time="2024-02-09T20:32:15.174450926Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1 Feb 9 20:32:15.174469 env[1563]: time="2024-02-09T20:32:15.174458309Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1 Feb 9 20:32:15.174469 env[1563]: time="2024-02-09T20:32:15.174465186Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1 Feb 9 20:32:15.174532 env[1563]: time="2024-02-09T20:32:15.174474000Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="no OpenTelemetry endpoint: skip plugin" type=io.containerd.tracing.processor.v1 Feb 9 20:32:15.174532 env[1563]: time="2024-02-09T20:32:15.174480069Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1 Feb 9 20:32:15.174532 env[1563]: time="2024-02-09T20:32:15.174489869Z" level=error msg="failed to initialize a tracing processor \"otlp\"" error="no OpenTelemetry endpoint: skip plugin" Feb 9 20:32:15.174532 env[1563]: time="2024-02-09T20:32:15.174512228Z" level=info msg="loading plugin \"io.containerd.grpc.v1.cri\"..." 
type=io.containerd.grpc.v1 Feb 9 20:32:15.174711 env[1563]: time="2024-02-09T20:32:15.174643308Z" level=info msg="Start cri plugin with config {PluginConfig:{ContainerdConfig:{Snapshotter:overlayfs DefaultRuntimeName:runc DefaultRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0} UntrustedWorkloadRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0} Runtimes:map[runc:{Type:io.containerd.runc.v2 Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[SystemdCgroup:false] PrivilegedWithoutHostDevices:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0}] NoPivot:false DisableSnapshotAnnotations:true DiscardUnpackedLayers:false IgnoreRdtNotEnabledErrors:false} CniConfig:{NetworkPluginBinDir:/opt/cni/bin NetworkPluginConfDir:/etc/cni/net.d NetworkPluginMaxConfNum:1 NetworkPluginConfTemplate: IPPreference:} Registry:{ConfigPath: Mirrors:map[] Configs:map[] Auths:map[] Headers:map[]} ImageDecryption:{KeyModel:node} DisableTCPService:true StreamServerAddress:127.0.0.1 StreamServerPort:0 StreamIdleTimeout:4h0m0s EnableSelinux:false SelinuxCategoryRange:1024 SandboxImage:registry.k8s.io/pause:3.6 StatsCollectPeriod:10 SystemdCgroup:false EnableTLSStreaming:false X509KeyPairStreaming:{TLSCertFile: TLSKeyFile:} MaxContainerLogLineSize:16384 DisableCgroup:false DisableApparmor:false RestrictOOMScoreAdj:false MaxConcurrentDownloads:3 DisableProcMount:false UnsetSeccompProfile: TolerateMissingHugetlbController:true DisableHugetlbController:true DeviceOwnershipFromSecurityContext:false IgnoreImageDefinedVolumes:false NetNSMountsUnderStateDir:false EnableUnprivilegedPorts:false EnableUnprivilegedICMP:false} ContainerdRootDir:/var/lib/containerd 
ContainerdEndpoint:/run/containerd/containerd.sock RootDir:/var/lib/containerd/io.containerd.grpc.v1.cri StateDir:/run/containerd/io.containerd.grpc.v1.cri}" Feb 9 20:32:15.174711 env[1563]: time="2024-02-09T20:32:15.174689172Z" level=info msg="Connect containerd service" Feb 9 20:32:15.177582 env[1563]: time="2024-02-09T20:32:15.174725238Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\"" Feb 9 20:32:15.177582 env[1563]: time="2024-02-09T20:32:15.174997818Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Feb 9 20:32:15.177582 env[1563]: time="2024-02-09T20:32:15.175089941Z" level=info msg="Start subscribing containerd event" Feb 9 20:32:15.177582 env[1563]: time="2024-02-09T20:32:15.175123749Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Feb 9 20:32:15.177582 env[1563]: time="2024-02-09T20:32:15.175134229Z" level=info msg="Start recovering state" Feb 9 20:32:15.177582 env[1563]: time="2024-02-09T20:32:15.175151091Z" level=info msg=serving... 
address=/run/containerd/containerd.sock Feb 9 20:32:15.177582 env[1563]: time="2024-02-09T20:32:15.175183552Z" level=info msg="containerd successfully booted in 0.032733s" Feb 9 20:32:15.177582 env[1563]: time="2024-02-09T20:32:15.175184563Z" level=info msg="Start event monitor" Feb 9 20:32:15.177582 env[1563]: time="2024-02-09T20:32:15.175195049Z" level=info msg="Start snapshots syncer" Feb 9 20:32:15.177582 env[1563]: time="2024-02-09T20:32:15.175203984Z" level=info msg="Start cni network conf syncer for default" Feb 9 20:32:15.177582 env[1563]: time="2024-02-09T20:32:15.175210398Z" level=info msg="Start streaming server" Feb 9 20:32:15.179486 systemd[1]: user-cloudinit-proc-cmdline.service was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Feb 9 20:32:15.179577 systemd[1]: Reached target user-config.target. Feb 9 20:32:15.189450 systemd[1]: Started containerd.service. Feb 9 20:32:15.193934 tar[1556]: ./portmap Feb 9 20:32:15.196694 systemd[1]: Finished update-ssh-keys-after-ignition.service. 
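The containerd startup above reports several snapshotters as skipped (aufs: module missing; btrfs/zfs: wrong backing filesystem; devmapper: not configured). A quick way to pull those skip decisions out of journal output is a small regex over the `msg="skip loading plugin ..."` entries; the sample lines below are abbreviated copies of the entries above, not live journal reads.

```python
import re

# Abbreviated copies of the containerd journal entries above (timestamps trimmed).
LINES = [
    'level=info msg="skip loading plugin \\"io.containerd.snapshotter.v1.aufs\\"..." error="aufs is not supported: skip plugin" type=io.containerd.snapshotter.v1',
    'level=info msg="skip loading plugin \\"io.containerd.snapshotter.v1.btrfs\\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem: skip plugin" type=io.containerd.snapshotter.v1',
    'level=info msg="loading plugin \\"io.containerd.snapshotter.v1.overlayfs\\"..." type=io.containerd.snapshotter.v1',
]

# containerd quotes the plugin name as \"...\" inside the msg field.
SKIP_RE = re.compile(r'msg="skip loading plugin \\"([^\\]+)\\"')

def skipped_plugins(lines):
    """Return the names of plugins containerd reported as skipped."""
    return [m.group(1) for line in lines if (m := SKIP_RE.search(line))]

print(skipped_plugins(LINES))
# → ['io.containerd.snapshotter.v1.aufs', 'io.containerd.snapshotter.v1.btrfs']
```

Skipped snapshotters are informational here: overlayfs loads successfully and is the configured default (`Snapshotter:overlayfs` in the CRI config that follows).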
Feb 9 20:32:15.214034 tar[1556]: ./host-local Feb 9 20:32:15.231552 tar[1556]: ./vrf Feb 9 20:32:15.233082 locksmithd[1598]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Feb 9 20:32:15.254363 tar[1556]: ./bridge Feb 9 20:32:15.277257 tar[1556]: ./tuning Feb 9 20:32:15.290348 kernel: mlx5_core 0000:01:00.1 enp1s0f1np1: Link up Feb 9 20:32:15.290500 kernel: bond0: (slave enp1s0f1np1): link status up again after 200 ms Feb 9 20:32:15.295561 tar[1556]: ./firewall Feb 9 20:32:15.309375 kernel: bond0: (slave enp1s0f1np1): speed changed to 0 on port 1 Feb 9 20:32:15.319194 tar[1556]: ./host-device Feb 9 20:32:15.326376 kernel: bond0: (slave enp1s0f1np1): link status up again after 200 ms Feb 9 20:32:15.327730 systemd-networkd[1418]: enp1s0f1np1: Link UP Feb 9 20:32:15.327734 systemd-networkd[1418]: enp1s0f1np1: Gained carrier Feb 9 20:32:15.339823 tar[1556]: ./sbr Feb 9 20:32:15.358729 tar[1556]: ./loopback Feb 9 20:32:15.364346 kernel: bond0: (slave enp1s0f1np1): link status definitely up, 25000 Mbps full duplex Feb 9 20:32:15.364379 kernel: EXT4-fs (sda9): resized filesystem to 116605649 Feb 9 20:32:15.376612 tar[1556]: ./dhcp Feb 9 20:32:15.390644 systemd-timesyncd[1494]: Network configuration changed, trying to establish connection. Feb 9 20:32:15.390700 systemd-timesyncd[1494]: Network configuration changed, trying to establish connection. Feb 9 20:32:15.390827 systemd-timesyncd[1494]: Network configuration changed, trying to establish connection. Feb 9 20:32:15.405894 extend-filesystems[1522]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required Feb 9 20:32:15.405894 extend-filesystems[1522]: old_desc_blocks = 1, new_desc_blocks = 56 Feb 9 20:32:15.405894 extend-filesystems[1522]: The filesystem on /dev/sda9 is now 116605649 (4k) blocks long. 
Feb 9 20:32:15.442407 extend-filesystems[1514]: Resized filesystem in /dev/sda9 Feb 9 20:32:15.442407 extend-filesystems[1514]: Found sdb Feb 9 20:32:15.406385 systemd[1]: extend-filesystems.service: Deactivated successfully. Feb 9 20:32:15.471498 tar[1556]: ./ptp Feb 9 20:32:15.471547 tar[1558]: linux-amd64/LICENSE Feb 9 20:32:15.471547 tar[1558]: linux-amd64/README.md Feb 9 20:32:15.406515 systemd[1]: Finished extend-filesystems.service. Feb 9 20:32:15.421749 systemd[1]: Finished prepare-critools.service. Feb 9 20:32:15.452119 systemd[1]: Finished prepare-helm.service. Feb 9 20:32:15.472346 tar[1556]: ./ipvlan Feb 9 20:32:15.492726 tar[1556]: ./bandwidth Feb 9 20:32:15.517484 systemd[1]: Finished prepare-cni-plugins.service. Feb 9 20:32:15.557238 sshd_keygen[1547]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Feb 9 20:32:15.568371 systemd[1]: Finished sshd-keygen.service. Feb 9 20:32:15.576322 systemd[1]: Starting issuegen.service... Feb 9 20:32:15.583663 systemd[1]: issuegen.service: Deactivated successfully. Feb 9 20:32:15.583765 systemd[1]: Finished issuegen.service. Feb 9 20:32:15.591217 systemd[1]: Starting systemd-user-sessions.service... Feb 9 20:32:15.599692 systemd[1]: Finished systemd-user-sessions.service. Feb 9 20:32:15.608085 systemd[1]: Started getty@tty1.service. Feb 9 20:32:15.615036 systemd[1]: Started serial-getty@ttyS1.service. Feb 9 20:32:15.623519 systemd[1]: Reached target getty.target. Feb 9 20:32:15.794366 coreos-metadata[1505]: Feb 09 20:32:15.794 INFO Fetching https://metadata.packet.net/metadata: Attempt #2 Feb 9 20:32:15.803740 coreos-metadata[1508]: Feb 09 20:32:15.803 INFO Fetching https://metadata.packet.net/metadata: Attempt #2 Feb 9 20:32:16.336479 systemd-networkd[1418]: bond0: Gained IPv6LL Feb 9 20:32:16.336754 systemd-timesyncd[1494]: Network configuration changed, trying to establish connection. Feb 9 20:32:16.465443 systemd-timesyncd[1494]: Network configuration changed, trying to establish connection. 
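The extend-filesystems output above reports the root filesystem on /dev/sda9 grown online to 116605649 blocks of 4 KiB. That block count can be sanity-checked by converting it to bytes and GiB:

```python
# Figures taken from the extend-filesystems journal entries above.
BLOCK_SIZE = 4096          # "(4k) blocks" per the resize output
BLOCK_COUNT = 116_605_649  # new block count reported for /dev/sda9

size_bytes = BLOCK_COUNT * BLOCK_SIZE
size_gib = size_bytes / 2**30

print(f"{size_bytes} bytes ≈ {size_gib:.1f} GiB")
# → 477616738304 bytes ≈ 444.8 GiB
```

That is, the on-line resize grew the root filesystem to roughly 445 GiB, consistent with first-boot root-partition expansion on a bare-metal Flatcar install.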
Feb 9 20:32:16.465919 systemd-timesyncd[1494]: Network configuration changed, trying to establish connection. Feb 9 20:32:18.642555 kernel: mlx5_core 0000:01:00.0: lag map port 1:1 port 2:2 shared_fdb:0 Feb 9 20:32:20.637747 login[1640]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Feb 9 20:32:20.644354 login[1639]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Feb 9 20:32:20.645846 systemd-logind[1548]: New session 1 of user core. Feb 9 20:32:20.646431 systemd[1]: Created slice user-500.slice. Feb 9 20:32:20.646941 systemd[1]: Starting user-runtime-dir@500.service... Feb 9 20:32:20.648062 systemd-logind[1548]: New session 2 of user core. Feb 9 20:32:20.652667 systemd[1]: Finished user-runtime-dir@500.service. Feb 9 20:32:20.653351 systemd[1]: Starting user@500.service... Feb 9 20:32:20.655397 (systemd)[1646]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Feb 9 20:32:20.732216 systemd[1646]: Queued start job for default target default.target. Feb 9 20:32:20.732318 systemd[1646]: Reached target paths.target. Feb 9 20:32:20.732328 systemd[1646]: Reached target sockets.target. Feb 9 20:32:20.732336 systemd[1646]: Reached target timers.target. Feb 9 20:32:20.732346 systemd[1646]: Reached target basic.target. Feb 9 20:32:20.732365 systemd[1646]: Reached target default.target. Feb 9 20:32:20.732378 systemd[1646]: Startup finished in 73ms. Feb 9 20:32:20.732443 systemd[1]: Started user@500.service. Feb 9 20:32:20.733041 systemd[1]: Started session-1.scope. Feb 9 20:32:20.733419 systemd[1]: Started session-2.scope. 
Feb 9 20:32:21.969684 coreos-metadata[1505]: Feb 09 20:32:21.969 INFO Failed to fetch: error sending request for url (https://metadata.packet.net/metadata): error trying to connect: dns error: failed to lookup address information: Name or service not known Feb 9 20:32:21.970536 coreos-metadata[1508]: Feb 09 20:32:21.969 INFO Failed to fetch: error sending request for url (https://metadata.packet.net/metadata): error trying to connect: dns error: failed to lookup address information: Name or service not known Feb 9 20:32:22.871406 kernel: mlx5_core 0000:01:00.0: modify lag map port 1:2 port 2:2 Feb 9 20:32:22.871570 kernel: mlx5_core 0000:01:00.0: modify lag map port 1:1 port 2:2 Feb 9 20:32:23.727165 systemd[1]: Created slice system-sshd.slice. Feb 9 20:32:23.727812 systemd[1]: Started sshd@0-86.109.11.101:22-139.178.89.65:55304.service. Feb 9 20:32:23.775325 sshd[1668]: Accepted publickey for core from 139.178.89.65 port 55304 ssh2: RSA SHA256:ya3CuIx5HRXQ7ikfrirbGy0PeU2mVoIERJKJ2pM2LHs Feb 9 20:32:23.776738 sshd[1668]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 9 20:32:23.781873 systemd-logind[1548]: New session 3 of user core. Feb 9 20:32:23.783377 systemd[1]: Started session-3.scope. Feb 9 20:32:23.834437 systemd[1]: Started sshd@1-86.109.11.101:22-139.178.89.65:55306.service. Feb 9 20:32:23.871545 sshd[1673]: Accepted publickey for core from 139.178.89.65 port 55306 ssh2: RSA SHA256:ya3CuIx5HRXQ7ikfrirbGy0PeU2mVoIERJKJ2pM2LHs Feb 9 20:32:23.872194 sshd[1673]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 9 20:32:23.874614 systemd-logind[1548]: New session 4 of user core. Feb 9 20:32:23.875095 systemd[1]: Started session-4.scope. Feb 9 20:32:23.926270 sshd[1673]: pam_unix(sshd:session): session closed for user core Feb 9 20:32:23.928739 systemd[1]: Started sshd@2-86.109.11.101:22-139.178.89.65:55318.service. 
Feb 9 20:32:23.929391 systemd[1]: sshd@1-86.109.11.101:22-139.178.89.65:55306.service: Deactivated successfully. Feb 9 20:32:23.930287 systemd-logind[1548]: Session 4 logged out. Waiting for processes to exit. Feb 9 20:32:23.930424 systemd[1]: session-4.scope: Deactivated successfully. Feb 9 20:32:23.931433 systemd-logind[1548]: Removed session 4. Feb 9 20:32:23.967513 sshd[1679]: Accepted publickey for core from 139.178.89.65 port 55318 ssh2: RSA SHA256:ya3CuIx5HRXQ7ikfrirbGy0PeU2mVoIERJKJ2pM2LHs Feb 9 20:32:23.968660 sshd[1679]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 9 20:32:23.969683 coreos-metadata[1505]: Feb 09 20:32:23.969 INFO Fetching https://metadata.packet.net/metadata: Attempt #3 Feb 9 20:32:23.969940 coreos-metadata[1508]: Feb 09 20:32:23.969 INFO Fetching https://metadata.packet.net/metadata: Attempt #3 Feb 9 20:32:23.972333 systemd-logind[1548]: New session 5 of user core. Feb 9 20:32:23.973323 systemd[1]: Started session-5.scope. Feb 9 20:32:23.989879 coreos-metadata[1505]: Feb 09 20:32:23.989 INFO Fetch successful Feb 9 20:32:23.990075 coreos-metadata[1508]: Feb 09 20:32:23.990 INFO Fetch successful Feb 9 20:32:24.014601 systemd[1]: Finished coreos-metadata.service. Feb 9 20:32:24.015438 unknown[1505]: wrote ssh authorized keys file for user: core Feb 9 20:32:24.015493 systemd[1]: Started packet-phone-home.service. Feb 9 20:32:24.020276 curl[1689]: % Total % Received % Xferd Average Speed Time Time Time Current Feb 9 20:32:24.020482 curl[1689]: Dload Upload Total Spent Left Speed Feb 9 20:32:24.023102 sshd[1679]: pam_unix(sshd:session): session closed for user core Feb 9 20:32:24.024336 systemd[1]: sshd@2-86.109.11.101:22-139.178.89.65:55318.service: Deactivated successfully. Feb 9 20:32:24.024894 systemd-logind[1548]: Session 5 logged out. Waiting for processes to exit. Feb 9 20:32:24.024926 systemd[1]: session-5.scope: Deactivated successfully. Feb 9 20:32:24.025336 systemd-logind[1548]: Removed session 5. 
Feb 9 20:32:24.025667 update-ssh-keys[1691]: Updated "/home/core/.ssh/authorized_keys" Feb 9 20:32:24.025944 systemd[1]: Finished coreos-metadata-sshkeys@core.service. Feb 9 20:32:24.026145 systemd[1]: Reached target multi-user.target. Feb 9 20:32:24.026921 systemd[1]: Starting systemd-update-utmp-runlevel.service... Feb 9 20:32:24.030581 systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully. Feb 9 20:32:24.030684 systemd[1]: Finished systemd-update-utmp-runlevel.service. Feb 9 20:32:24.030807 systemd[1]: Startup finished in 35.172s (kernel) + 15.930s (userspace) = 51.103s. Feb 9 20:32:24.212703 curl[1689]: \u000d 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\u000d 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 Feb 9 20:32:24.215025 systemd[1]: packet-phone-home.service: Deactivated successfully. Feb 9 20:32:34.030075 systemd[1]: Started sshd@3-86.109.11.101:22-139.178.89.65:49802.service. Feb 9 20:32:34.065732 sshd[1700]: Accepted publickey for core from 139.178.89.65 port 49802 ssh2: RSA SHA256:ya3CuIx5HRXQ7ikfrirbGy0PeU2mVoIERJKJ2pM2LHs Feb 9 20:32:34.066585 sshd[1700]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 9 20:32:34.069535 systemd-logind[1548]: New session 6 of user core. Feb 9 20:32:34.070137 systemd[1]: Started session-6.scope. Feb 9 20:32:34.124694 sshd[1700]: pam_unix(sshd:session): session closed for user core Feb 9 20:32:34.126050 systemd[1]: Started sshd@4-86.109.11.101:22-139.178.89.65:49806.service. Feb 9 20:32:34.126415 systemd[1]: sshd@3-86.109.11.101:22-139.178.89.65:49802.service: Deactivated successfully. Feb 9 20:32:34.127010 systemd[1]: session-6.scope: Deactivated successfully. Feb 9 20:32:34.127022 systemd-logind[1548]: Session 6 logged out. Waiting for processes to exit. Feb 9 20:32:34.127396 systemd-logind[1548]: Removed session 6. 
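The journal above records "Startup finished in 35.172s (kernel) + 15.930s (userspace) = 51.103s." The phases can be parsed out of that line and cross-checked; note systemd rounds each figure independently, so the printed total can differ from the printed sum by a millisecond or two (here 35.172 + 15.930 = 51.102 vs. a reported 51.103).

```python
import re

# The "Startup finished" line from the journal above.
LINE = "Startup finished in 35.172s (kernel) + 15.930s (userspace) = 51.103s."

def parse_startup(line):
    """Extract (kernel, userspace, total) seconds from a systemd startup line."""
    kernel, userspace, total = (float(x) for x in re.findall(r"([\d.]+)s", line))
    return kernel, userspace, total

kernel, userspace, total = parse_startup(LINE)
# Allow a few ms of slack for systemd's independent rounding of each phase.
assert abs((kernel + userspace) - total) < 0.005
print(kernel, userspace, total)
```

The unusually long 35 s kernel phase is dominated by the hardware probing visible earlier in this log (mlx5 link training, bonding, EFI/ACPI setup on the Supermicro board), not by systemd itself.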
Feb 9 20:32:34.161868 sshd[1705]: Accepted publickey for core from 139.178.89.65 port 49806 ssh2: RSA SHA256:ya3CuIx5HRXQ7ikfrirbGy0PeU2mVoIERJKJ2pM2LHs Feb 9 20:32:34.162713 sshd[1705]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 9 20:32:34.165648 systemd-logind[1548]: New session 7 of user core. Feb 9 20:32:34.166246 systemd[1]: Started session-7.scope. Feb 9 20:32:34.218474 sshd[1705]: pam_unix(sshd:session): session closed for user core Feb 9 20:32:34.219770 systemd[1]: Started sshd@5-86.109.11.101:22-139.178.89.65:49810.service. Feb 9 20:32:34.220016 systemd[1]: sshd@4-86.109.11.101:22-139.178.89.65:49806.service: Deactivated successfully. Feb 9 20:32:34.220527 systemd-logind[1548]: Session 7 logged out. Waiting for processes to exit. Feb 9 20:32:34.220571 systemd[1]: session-7.scope: Deactivated successfully. Feb 9 20:32:34.221022 systemd-logind[1548]: Removed session 7. Feb 9 20:32:34.255723 sshd[1712]: Accepted publickey for core from 139.178.89.65 port 49810 ssh2: RSA SHA256:ya3CuIx5HRXQ7ikfrirbGy0PeU2mVoIERJKJ2pM2LHs Feb 9 20:32:34.256519 sshd[1712]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 9 20:32:34.259323 systemd-logind[1548]: New session 8 of user core. Feb 9 20:32:34.259882 systemd[1]: Started session-8.scope. Feb 9 20:32:34.313150 sshd[1712]: pam_unix(sshd:session): session closed for user core Feb 9 20:32:34.314505 systemd[1]: Started sshd@6-86.109.11.101:22-139.178.89.65:49818.service. Feb 9 20:32:34.314796 systemd[1]: sshd@5-86.109.11.101:22-139.178.89.65:49810.service: Deactivated successfully. Feb 9 20:32:34.315259 systemd-logind[1548]: Session 8 logged out. Waiting for processes to exit. Feb 9 20:32:34.315278 systemd[1]: session-8.scope: Deactivated successfully. Feb 9 20:32:34.315856 systemd-logind[1548]: Removed session 8. 
Feb 9 20:32:34.358825 sshd[1720]: Accepted publickey for core from 139.178.89.65 port 49818 ssh2: RSA SHA256:ya3CuIx5HRXQ7ikfrirbGy0PeU2mVoIERJKJ2pM2LHs Feb 9 20:32:34.359478 sshd[1720]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 9 20:32:34.361707 systemd-logind[1548]: New session 9 of user core. Feb 9 20:32:34.362103 systemd[1]: Started session-9.scope. Feb 9 20:32:34.438078 sudo[1725]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Feb 9 20:32:34.438684 sudo[1725]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=500) Feb 9 20:32:34.463943 dbus-daemon[1511]: \xd0m\xa69\xd7U: received setenforce notice (enforcing=-855126240) Feb 9 20:32:34.468970 sudo[1725]: pam_unix(sudo:session): session closed for user root Feb 9 20:32:34.474335 sshd[1720]: pam_unix(sshd:session): session closed for user core Feb 9 20:32:34.480555 systemd[1]: Started sshd@7-86.109.11.101:22-139.178.89.65:49824.service. Feb 9 20:32:34.482059 systemd[1]: sshd@6-86.109.11.101:22-139.178.89.65:49818.service: Deactivated successfully. Feb 9 20:32:34.484480 systemd-logind[1548]: Session 9 logged out. Waiting for processes to exit. Feb 9 20:32:34.484595 systemd[1]: session-9.scope: Deactivated successfully. Feb 9 20:32:34.487171 systemd-logind[1548]: Removed session 9. Feb 9 20:32:34.589354 sshd[1727]: Accepted publickey for core from 139.178.89.65 port 49824 ssh2: RSA SHA256:ya3CuIx5HRXQ7ikfrirbGy0PeU2mVoIERJKJ2pM2LHs Feb 9 20:32:34.591148 sshd[1727]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 9 20:32:34.596997 systemd-logind[1548]: New session 10 of user core. Feb 9 20:32:34.598168 systemd[1]: Started session-10.scope. 
Feb 9 20:32:34.652913 sudo[1734]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Feb 9 20:32:34.653126 sudo[1734]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=500) Feb 9 20:32:34.656452 sudo[1734]: pam_unix(sudo:session): session closed for user root Feb 9 20:32:34.662127 sudo[1733]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/systemctl restart audit-rules Feb 9 20:32:34.662428 sudo[1733]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=500) Feb 9 20:32:34.681848 systemd[1]: Stopping audit-rules.service... Feb 9 20:32:34.683000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Feb 9 20:32:34.685242 auditctl[1737]: No rules Feb 9 20:32:34.686081 systemd[1]: audit-rules.service: Deactivated successfully. Feb 9 20:32:34.686680 systemd[1]: Stopped audit-rules.service. Feb 9 20:32:34.690678 kernel: kauditd_printk_skb: 80 callbacks suppressed Feb 9 20:32:34.690843 kernel: audit: type=1305 audit(1707510754.683:132): auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Feb 9 20:32:34.690749 systemd[1]: Starting audit-rules.service... Feb 9 20:32:34.683000 audit[1737]: SYSCALL arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7fffbcab7210 a2=420 a3=0 items=0 ppid=1 pid=1737 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/sbin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:32:34.726327 augenrules[1755]: No rules Feb 9 20:32:34.726968 systemd[1]: Finished audit-rules.service. 
Feb 9 20:32:34.727804 sudo[1733]: pam_unix(sudo:session): session closed for user root Feb 9 20:32:34.729065 sshd[1727]: pam_unix(sshd:session): session closed for user core Feb 9 20:32:34.731172 systemd[1]: sshd@7-86.109.11.101:22-139.178.89.65:49824.service: Deactivated successfully. Feb 9 20:32:34.732140 systemd-logind[1548]: Session 10 logged out. Waiting for processes to exit. Feb 9 20:32:34.732146 systemd[1]: session-10.scope: Deactivated successfully. Feb 9 20:32:34.733034 systemd-logind[1548]: Removed session 10. Feb 9 20:32:34.737890 kernel: audit: type=1300 audit(1707510754.683:132): arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7fffbcab7210 a2=420 a3=0 items=0 ppid=1 pid=1737 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/sbin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:32:34.737933 kernel: audit: type=1327 audit(1707510754.683:132): proctitle=2F7362696E2F617564697463746C002D44 Feb 9 20:32:34.683000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D44 Feb 9 20:32:34.685000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:32:34.748950 systemd[1]: Started sshd@8-86.109.11.101:22-139.178.89.65:49828.service. Feb 9 20:32:34.769947 kernel: audit: type=1131 audit(1707510754.685:133): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:32:34.770006 kernel: audit: type=1130 audit(1707510754.725:134): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 9 20:32:34.725000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:32:34.792417 kernel: audit: type=1106 audit(1707510754.726:135): pid=1733 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_unix,pam_permit,pam_systemd acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Feb 9 20:32:34.726000 audit[1733]: USER_END pid=1733 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_unix,pam_permit,pam_systemd acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Feb 9 20:32:34.818499 kernel: audit: type=1104 audit(1707510754.726:136): pid=1733 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Feb 9 20:32:34.726000 audit[1733]: CRED_DISP pid=1733 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? 
res=success' Feb 9 20:32:34.842136 kernel: audit: type=1106 audit(1707510754.728:137): pid=1727 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:32:34.728000 audit[1727]: USER_END pid=1727 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:32:34.874266 kernel: audit: type=1104 audit(1707510754.728:138): pid=1727 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:32:34.728000 audit[1727]: CRED_DISP pid=1727 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:32:34.900146 kernel: audit: type=1131 audit(1707510754.730:139): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-86.109.11.101:22-139.178.89.65:49824 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:32:34.730000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-86.109.11.101:22-139.178.89.65:49824 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 9 20:32:34.747000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-86.109.11.101:22-139.178.89.65:49828 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:32:34.933000 audit[1762]: USER_ACCT pid=1762 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:32:34.934358 sshd[1762]: Accepted publickey for core from 139.178.89.65 port 49828 ssh2: RSA SHA256:ya3CuIx5HRXQ7ikfrirbGy0PeU2mVoIERJKJ2pM2LHs Feb 9 20:32:34.933000 audit[1762]: CRED_ACQ pid=1762 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:32:34.933000 audit[1762]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff15f834b0 a2=3 a3=0 items=0 ppid=1 pid=1762 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=11 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:32:34.933000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 9 20:32:34.934943 sshd[1762]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 9 20:32:34.937258 systemd-logind[1548]: New session 11 of user core. Feb 9 20:32:34.937681 systemd[1]: Started session-11.scope. 
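The audit records above include PROCTITLE entries, which audit(8) emits as hex-encoded argv with NUL bytes separating the arguments. The two values appearing in this log decode as follows:

```python
def decode_proctitle(hexstr: str) -> str:
    """Decode an audit PROCTITLE hex string into a readable command line."""
    return bytes.fromhex(hexstr).replace(b"\x00", b" ").decode()

# Values copied from the PROCTITLE records in the journal above.
print(decode_proctitle("2F7362696E2F617564697463746C002D44"))
# → /sbin/auditctl -D
print(decode_proctitle("737368643A20636F7265205B707269765D"))
# → sshd: core [priv]
```

So the CONFIG_CHANGE/SYSCALL sequence above corresponds to `auditctl -D` (delete all rules) issued while audit-rules.service was being restarted, and the later record is the privileged sshd monitor process for the incoming session.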
Feb 9 20:32:34.938000 audit[1762]: USER_START pid=1762 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:32:34.938000 audit[1765]: CRED_ACQ pid=1765 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:32:34.984000 audit[1766]: USER_ACCT pid=1766 uid=500 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Feb 9 20:32:34.984000 audit[1766]: CRED_REFR pid=1766 uid=500 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Feb 9 20:32:34.986168 sudo[1766]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Feb 9 20:32:34.986327 sudo[1766]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=500) Feb 9 20:32:34.986000 audit[1766]: USER_START pid=1766 uid=500 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_limits,pam_env,pam_unix,pam_permit,pam_systemd acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Feb 9 20:32:38.997984 systemd[1]: Starting systemd-networkd-wait-online.service... Feb 9 20:32:39.002568 systemd[1]: Finished systemd-networkd-wait-online.service. Feb 9 20:32:39.001000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-networkd-wait-online comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 9 20:32:39.002804 systemd[1]: Reached target network-online.target. Feb 9 20:32:39.003632 systemd[1]: Starting docker.service... Feb 9 20:32:39.021360 env[1787]: time="2024-02-09T20:32:39.021328216Z" level=info msg="Starting up" Feb 9 20:32:39.022013 env[1787]: time="2024-02-09T20:32:39.022002826Z" level=info msg="parsed scheme: \"unix\"" module=grpc Feb 9 20:32:39.022013 env[1787]: time="2024-02-09T20:32:39.022012156Z" level=info msg="scheme \"unix\" not registered, fallback to default scheme" module=grpc Feb 9 20:32:39.022058 env[1787]: time="2024-02-09T20:32:39.022023594Z" level=info msg="ccResolverWrapper: sending update to cc: {[{unix:///var/run/docker/libcontainerd/docker-containerd.sock 0 }] }" module=grpc Feb 9 20:32:39.022058 env[1787]: time="2024-02-09T20:32:39.022030155Z" level=info msg="ClientConn switching balancer to \"pick_first\"" module=grpc Feb 9 20:32:39.023463 env[1787]: time="2024-02-09T20:32:39.023421866Z" level=info msg="parsed scheme: \"unix\"" module=grpc Feb 9 20:32:39.023463 env[1787]: time="2024-02-09T20:32:39.023431821Z" level=info msg="scheme \"unix\" not registered, fallback to default scheme" module=grpc Feb 9 20:32:39.023463 env[1787]: time="2024-02-09T20:32:39.023441206Z" level=info msg="ccResolverWrapper: sending update to cc: {[{unix:///var/run/docker/libcontainerd/docker-containerd.sock 0 }] }" module=grpc Feb 9 20:32:39.023463 env[1787]: time="2024-02-09T20:32:39.023446825Z" level=info msg="ClientConn switching balancer to \"pick_first\"" module=grpc Feb 9 20:32:39.471710 env[1787]: time="2024-02-09T20:32:39.471662090Z" level=warning msg="Your kernel does not support cgroup blkio weight" Feb 9 20:32:39.471710 env[1787]: time="2024-02-09T20:32:39.471676344Z" level=warning msg="Your kernel does not support cgroup blkio weight_device" Feb 9 20:32:39.471892 env[1787]: time="2024-02-09T20:32:39.471798138Z" level=info msg="Loading containers: start." 
Feb 9 20:32:39.501000 audit[1834]: NETFILTER_CFG table=nat:2 family=2 entries=2 op=nft_register_chain pid=1834 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 9 20:32:39.501000 audit[1834]: SYSCALL arch=c000003e syscall=46 success=yes exit=116 a0=3 a1=7ffeef514c00 a2=0 a3=7ffeef514bec items=0 ppid=1787 pid=1834 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:32:39.501000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552 Feb 9 20:32:39.501000 audit[1836]: NETFILTER_CFG table=filter:3 family=2 entries=2 op=nft_register_chain pid=1836 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 9 20:32:39.501000 audit[1836]: SYSCALL arch=c000003e syscall=46 success=yes exit=124 a0=3 a1=7ffe2e774520 a2=0 a3=7ffe2e77450c items=0 ppid=1787 pid=1836 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:32:39.501000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552 Feb 9 20:32:39.502000 audit[1838]: NETFILTER_CFG table=filter:4 family=2 entries=1 op=nft_register_chain pid=1838 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 9 20:32:39.502000 audit[1838]: SYSCALL arch=c000003e syscall=46 success=yes exit=112 a0=3 a1=7ffda249cf50 a2=0 a3=7ffda249cf3c items=0 ppid=1787 pid=1838 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:32:39.502000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31 
Feb 9 20:32:39.503000 audit[1840]: NETFILTER_CFG table=filter:5 family=2 entries=1 op=nft_register_chain pid=1840 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 9 20:32:39.503000 audit[1840]: SYSCALL arch=c000003e syscall=46 success=yes exit=112 a0=3 a1=7fff2000a540 a2=0 a3=7fff2000a52c items=0 ppid=1787 pid=1840 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:32:39.503000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32 Feb 9 20:32:39.505000 audit[1842]: NETFILTER_CFG table=filter:6 family=2 entries=1 op=nft_register_rule pid=1842 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 9 20:32:39.505000 audit[1842]: SYSCALL arch=c000003e syscall=46 success=yes exit=228 a0=3 a1=7ffd83b9bf10 a2=0 a3=7ffd83b9befc items=0 ppid=1787 pid=1842 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:32:39.505000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4100444F434B45522D49534F4C4154494F4E2D53544147452D31002D6A0052455455524E Feb 9 20:32:39.533000 audit[1847]: NETFILTER_CFG table=filter:7 family=2 entries=1 op=nft_register_rule pid=1847 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 9 20:32:39.533000 audit[1847]: SYSCALL arch=c000003e syscall=46 success=yes exit=228 a0=3 a1=7ffde95fb2f0 a2=0 a3=7ffde95fb2dc items=0 ppid=1787 pid=1847 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:32:39.533000 audit: PROCTITLE 
proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4100444F434B45522D49534F4C4154494F4E2D53544147452D32002D6A0052455455524E Feb 9 20:32:39.536000 audit[1849]: NETFILTER_CFG table=filter:8 family=2 entries=1 op=nft_register_chain pid=1849 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 9 20:32:39.536000 audit[1849]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffcff0e6650 a2=0 a3=7ffcff0e663c items=0 ppid=1787 pid=1849 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:32:39.536000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552 Feb 9 20:32:39.538000 audit[1851]: NETFILTER_CFG table=filter:9 family=2 entries=1 op=nft_register_rule pid=1851 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 9 20:32:39.538000 audit[1851]: SYSCALL arch=c000003e syscall=46 success=yes exit=212 a0=3 a1=7ffc3761d860 a2=0 a3=7ffc3761d84c items=0 ppid=1787 pid=1851 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:32:39.538000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E Feb 9 20:32:39.539000 audit[1853]: NETFILTER_CFG table=filter:10 family=2 entries=2 op=nft_register_chain pid=1853 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 9 20:32:39.539000 audit[1853]: SYSCALL arch=c000003e syscall=46 success=yes exit=308 a0=3 a1=7ffd3a451130 a2=0 a3=7ffd3a45111c items=0 ppid=1787 pid=1853 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 
20:32:39.539000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Feb 9 20:32:39.545000 audit[1857]: NETFILTER_CFG table=filter:11 family=2 entries=1 op=nft_unregister_rule pid=1857 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 9 20:32:39.545000 audit[1857]: SYSCALL arch=c000003e syscall=46 success=yes exit=216 a0=3 a1=7ffcf625d960 a2=0 a3=7ffcf625d94c items=0 ppid=1787 pid=1857 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:32:39.545000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4400464F5257415244002D6A00444F434B45522D55534552 Feb 9 20:32:39.546000 audit[1858]: NETFILTER_CFG table=filter:12 family=2 entries=1 op=nft_register_rule pid=1858 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 9 20:32:39.546000 audit[1858]: SYSCALL arch=c000003e syscall=46 success=yes exit=224 a0=3 a1=7ffea5a44960 a2=0 a3=7ffea5a4494c items=0 ppid=1787 pid=1858 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:32:39.546000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Feb 9 20:32:39.560354 kernel: Initializing XFRM netlink socket Feb 9 20:32:39.610624 env[1787]: time="2024-02-09T20:32:39.610604015Z" level=info msg="Default bridge (docker0) is assigned with an IP address 172.17.0.0/16. Daemon option --bip can be used to set a preferred IP address" Feb 9 20:32:39.611384 systemd-timesyncd[1494]: Network configuration changed, trying to establish connection. 
Feb 9 20:32:39.623000 audit[1866]: NETFILTER_CFG table=nat:13 family=2 entries=2 op=nft_register_chain pid=1866 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 9 20:32:39.623000 audit[1866]: SYSCALL arch=c000003e syscall=46 success=yes exit=492 a0=3 a1=7ffdbe74acf0 a2=0 a3=7ffdbe74acdc items=0 ppid=1787 pid=1866 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:32:39.623000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D74006E6174002D4900504F5354524F5554494E47002D73003137322E31372E302E302F31360000002D6F00646F636B657230002D6A004D415351554552414445 Feb 9 20:32:39.646000 audit[1869]: NETFILTER_CFG table=nat:14 family=2 entries=1 op=nft_register_rule pid=1869 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 9 20:32:39.646000 audit[1869]: SYSCALL arch=c000003e syscall=46 success=yes exit=288 a0=3 a1=7fff91261b80 a2=0 a3=7fff91261b6c items=0 ppid=1787 pid=1869 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:32:39.646000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D74006E6174002D4900444F434B4552002D6900646F636B657230002D6A0052455455524E Feb 9 20:32:39.649000 audit[1872]: NETFILTER_CFG table=filter:15 family=2 entries=1 op=nft_register_rule pid=1872 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 9 20:32:39.649000 audit[1872]: SYSCALL arch=c000003e syscall=46 success=yes exit=376 a0=3 a1=7ffd8256b9c0 a2=0 a3=7ffd8256b9ac items=0 ppid=1787 pid=1872 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:32:39.649000 audit: PROCTITLE 
proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6900646F636B657230002D6F00646F636B657230002D6A00414343455054 Feb 9 20:32:39.651000 audit[1874]: NETFILTER_CFG table=filter:16 family=2 entries=1 op=nft_register_rule pid=1874 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 9 20:32:39.651000 audit[1874]: SYSCALL arch=c000003e syscall=46 success=yes exit=376 a0=3 a1=7ffd4c28d370 a2=0 a3=7ffd4c28d35c items=0 ppid=1787 pid=1874 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:32:39.651000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6900646F636B6572300000002D6F00646F636B657230002D6A00414343455054 Feb 9 20:32:39.653000 audit[1876]: NETFILTER_CFG table=nat:17 family=2 entries=2 op=nft_register_chain pid=1876 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 9 20:32:39.653000 audit[1876]: SYSCALL arch=c000003e syscall=46 success=yes exit=356 a0=3 a1=7ffeffefa840 a2=0 a3=7ffeffefa82c items=0 ppid=1787 pid=1876 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:32:39.653000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552 Feb 9 20:32:39.655000 audit[1878]: NETFILTER_CFG table=nat:18 family=2 entries=2 op=nft_register_chain pid=1878 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 9 20:32:39.655000 audit[1878]: SYSCALL arch=c000003e syscall=46 success=yes exit=444 a0=3 a1=7fffca8c8510 a2=0 a3=7fffca8c84fc items=0 ppid=1787 pid=1878 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:32:39.655000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003132372E302E302E302F38 Feb 9 20:32:39.658000 audit[1880]: NETFILTER_CFG table=filter:19 family=2 entries=1 op=nft_register_rule pid=1880 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 9 20:32:39.658000 audit[1880]: SYSCALL arch=c000003e syscall=46 success=yes exit=304 a0=3 a1=7ffd9374db30 a2=0 a3=7ffd9374db1c items=0 ppid=1787 pid=1880 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:32:39.658000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6F00646F636B657230002D6A00444F434B4552 Feb 9 20:32:39.672000 audit[1883]: NETFILTER_CFG table=filter:20 family=2 entries=1 op=nft_register_rule pid=1883 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 9 20:32:39.672000 audit[1883]: SYSCALL arch=c000003e syscall=46 success=yes exit=508 a0=3 a1=7fffabc8c620 a2=0 a3=7fffabc8c60c items=0 ppid=1787 pid=1883 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:32:39.672000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6F00646F636B657230002D6D00636F6E6E747261636B002D2D637473746174650052454C415445442C45535441424C4953484544002D6A00414343455054 Feb 9 20:32:39.676000 audit[1885]: NETFILTER_CFG table=filter:21 family=2 entries=1 op=nft_register_rule pid=1885 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 9 20:32:39.676000 
audit[1885]: SYSCALL arch=c000003e syscall=46 success=yes exit=240 a0=3 a1=7ffcfe699230 a2=0 a3=7ffcfe69921c items=0 ppid=1787 pid=1885 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:32:39.676000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31 Feb 9 20:32:39.680000 audit[1887]: NETFILTER_CFG table=filter:22 family=2 entries=1 op=nft_register_rule pid=1887 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 9 20:32:39.680000 audit[1887]: SYSCALL arch=c000003e syscall=46 success=yes exit=428 a0=3 a1=7ffdb1f276c0 a2=0 a3=7ffdb1f276ac items=0 ppid=1787 pid=1887 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:32:39.680000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D740066696C746572002D4900444F434B45522D49534F4C4154494F4E2D53544147452D31002D6900646F636B6572300000002D6F00646F636B657230002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D32 Feb 9 20:32:39.684000 audit[1889]: NETFILTER_CFG table=filter:23 family=2 entries=1 op=nft_register_rule pid=1889 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 9 20:32:39.686509 systemd-networkd[1418]: docker0: Link UP Feb 9 20:32:39.691722 kernel: kauditd_printk_skb: 75 callbacks suppressed Feb 9 20:32:39.691849 kernel: audit: type=1325 audit(1707510759.684:171): table=filter:23 family=2 entries=1 op=nft_register_rule pid=1889 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 9 20:32:39.701155 env[1787]: time="2024-02-09T20:32:39.701084983Z" level=info msg="Loading containers: done." 
Feb 9 20:32:39.684000 audit[1889]: SYSCALL arch=c000003e syscall=46 success=yes exit=312 a0=3 a1=7ffe13445f10 a2=0 a3=7ffe13445efc items=0 ppid=1787 pid=1889 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:32:39.826828 kernel: audit: type=1300 audit(1707510759.684:171): arch=c000003e syscall=46 success=yes exit=312 a0=3 a1=7ffe13445f10 a2=0 a3=7ffe13445efc items=0 ppid=1787 pid=1889 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:32:39.826875 kernel: audit: type=1327 audit(1707510759.684:171): proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D740066696C746572002D4900444F434B45522D49534F4C4154494F4E2D53544147452D32002D6F00646F636B657230002D6A0044524F50 Feb 9 20:32:39.684000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D740066696C746572002D4900444F434B45522D49534F4C4154494F4E2D53544147452D32002D6F00646F636B657230002D6A0044524F50 Feb 9 20:32:39.827070 env[1787]: time="2024-02-09T20:32:39.827028382Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Feb 9 20:32:39.827167 env[1787]: time="2024-02-09T20:32:39.827123353Z" level=info msg="Docker daemon" commit=112bdf3343 graphdriver(s)=overlay2 version=20.10.23 Feb 9 20:32:39.827193 env[1787]: time="2024-02-09T20:32:39.827167886Z" level=info msg="Daemon has completed initialization" Feb 9 20:32:39.833597 systemd[1]: Started docker.service. Feb 9 20:32:39.835847 env[1787]: time="2024-02-09T20:32:39.835803027Z" level=info msg="API listen on /run/docker.sock" Feb 9 20:32:39.846243 systemd[1]: Reloading. 
Feb 9 20:32:39.879927 /usr/lib/systemd/system-generators/torcx-generator[1943]: time="2024-02-09T20:32:39Z" level=debug msg="common configuration parsed" base_dir=/var/lib/torcx/ conf_dir=/etc/torcx/ run_dir=/run/torcx/ store_paths="[/usr/share/torcx/store /usr/share/oem/torcx/store/3510.3.2 /usr/share/oem/torcx/store /var/lib/torcx/store/3510.3.2 /var/lib/torcx/store]" Feb 9 20:32:39.879949 /usr/lib/systemd/system-generators/torcx-generator[1943]: time="2024-02-09T20:32:39Z" level=info msg="torcx already run" Feb 9 20:32:39.882403 kernel: audit: type=1325 audit(1707510759.697:172): table=filter:24 family=2 entries=1 op=nft_unregister_rule pid=1893 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 9 20:32:39.882436 kernel: audit: type=1300 audit(1707510759.697:172): arch=c000003e syscall=46 success=yes exit=228 a0=3 a1=7ffdef72d4d0 a2=0 a3=7ffdef72d4bc items=0 ppid=1787 pid=1893 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:32:39.882455 kernel: audit: type=1327 audit(1707510759.697:172): proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4400464F5257415244002D6A00444F434B45522D55534552 Feb 9 20:32:39.882468 kernel: audit: type=1325 audit(1707510759.699:173): table=filter:25 family=2 entries=1 op=nft_register_rule pid=1894 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 9 20:32:39.882480 kernel: audit: type=1300 audit(1707510759.699:173): arch=c000003e syscall=46 success=yes exit=224 a0=3 a1=7ffcf1947980 a2=0 a3=7ffcf194796c items=0 ppid=1787 pid=1894 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:32:39.882493 kernel: audit: type=1327 audit(1707510759.699:173): 
proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Feb 9 20:32:39.882504 kernel: audit: type=1130 audit(1707510759.832:174): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=docker comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:32:39.697000 audit[1893]: NETFILTER_CFG table=filter:24 family=2 entries=1 op=nft_unregister_rule pid=1893 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 9 20:32:39.697000 audit[1893]: SYSCALL arch=c000003e syscall=46 success=yes exit=228 a0=3 a1=7ffdef72d4d0 a2=0 a3=7ffdef72d4bc items=0 ppid=1787 pid=1893 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:32:39.697000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4400464F5257415244002D6A00444F434B45522D55534552 Feb 9 20:32:39.699000 audit[1894]: NETFILTER_CFG table=filter:25 family=2 entries=1 op=nft_register_rule pid=1894 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 9 20:32:39.699000 audit[1894]: SYSCALL arch=c000003e syscall=46 success=yes exit=224 a0=3 a1=7ffcf1947980 a2=0 a3=7ffcf194796c items=0 ppid=1787 pid=1894 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:32:39.699000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Feb 9 20:32:39.832000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=docker comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 9 20:32:39.978740 systemd-timesyncd[1494]: Contacted time server [2606:4700:f1::123]:123 (2.flatcar.pool.ntp.org). Feb 9 20:32:39.978767 systemd-timesyncd[1494]: Initial clock synchronization to Fri 2024-02-09 20:32:40.300008 UTC. Feb 9 20:32:40.390669 systemd[1]: /usr/lib/systemd/system/locksmithd.service:8: Unit uses CPUShares=; please use CPUWeight= instead. Support for CPUShares= will be removed soon. Feb 9 20:32:40.390676 systemd[1]: /usr/lib/systemd/system/locksmithd.service:9: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 9 20:32:40.401888 systemd[1]: /run/systemd/system/docker.socket:8: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Feb 9 20:32:40.455568 systemd[1]: Started kubelet.service. Feb 9 20:32:40.454000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:32:40.480450 kubelet[2008]: E0209 20:32:40.480357 2008 run.go:74] "command failed" err="failed to validate kubelet flags: the container runtime endpoint address was not specified or empty, use --container-runtime-endpoint to set" Feb 9 20:32:40.481737 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Feb 9 20:32:40.481853 systemd[1]: kubelet.service: Failed with result 'exit-code'. Feb 9 20:32:40.480000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=failed' Feb 9 20:32:41.225423 env[1563]: time="2024-02-09T20:32:41.225273444Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.26.13\"" Feb 9 20:32:41.915953 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount827284711.mount: Deactivated successfully. Feb 9 20:32:44.181967 env[1563]: time="2024-02-09T20:32:44.181909364Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/kube-apiserver:v1.26.13,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 9 20:32:44.182589 env[1563]: time="2024-02-09T20:32:44.182543732Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:84900298406b2df97ade16b73c49c2b73265ded8735ac19a4e20c2a4ad65853f,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 9 20:32:44.184015 env[1563]: time="2024-02-09T20:32:44.183982924Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/kube-apiserver:v1.26.13,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 9 20:32:44.184858 env[1563]: time="2024-02-09T20:32:44.184811673Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/kube-apiserver@sha256:2f28bed4096abd572a56595ac0304238bdc271dcfe22c650707c09bf97ec16fd,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 9 20:32:44.185795 env[1563]: time="2024-02-09T20:32:44.185751423Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.26.13\" returns image reference \"sha256:84900298406b2df97ade16b73c49c2b73265ded8735ac19a4e20c2a4ad65853f\"" Feb 9 20:32:44.191595 env[1563]: time="2024-02-09T20:32:44.191511752Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.26.13\"" Feb 9 20:32:46.708585 env[1563]: time="2024-02-09T20:32:46.708555097Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/kube-controller-manager:v1.26.13,Labels:map[string]string{io.cri-containerd.image: 
managed,},XXX_unrecognized:[],}"
Feb 9 20:32:46.709229 env[1563]: time="2024-02-09T20:32:46.709216143Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:921f237b560bdb02300f82d3606635d395b20635512fab10f0191cff42079486,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 9 20:32:46.710326 env[1563]: time="2024-02-09T20:32:46.710313102Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/kube-controller-manager:v1.26.13,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 9 20:32:46.711359 env[1563]: time="2024-02-09T20:32:46.711341883Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/kube-controller-manager@sha256:fda420c6c15cdd01c4eba3404f0662fe486a9c7f38fa13c741a21334673841a2,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 9 20:32:46.711900 env[1563]: time="2024-02-09T20:32:46.711866837Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.26.13\" returns image reference \"sha256:921f237b560bdb02300f82d3606635d395b20635512fab10f0191cff42079486\""
Feb 9 20:32:46.717981 env[1563]: time="2024-02-09T20:32:46.717925066Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.26.13\""
Feb 9 20:32:48.309156 env[1563]: time="2024-02-09T20:32:48.309101959Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/kube-scheduler:v1.26.13,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 9 20:32:48.309933 env[1563]: time="2024-02-09T20:32:48.309886454Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:4fe82b56f06250b6b7eb3d5a879cd2cfabf41cb3e45b24af6059eadbc3b8026e,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 9 20:32:48.311217 env[1563]: time="2024-02-09T20:32:48.311177759Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/kube-scheduler:v1.26.13,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 9 20:32:48.312282 env[1563]: time="2024-02-09T20:32:48.312239693Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/kube-scheduler@sha256:c3c7303ee6d01c8e5a769db28661cf854b55175aa72c67e9b6a7b9d47ac42af3,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 9 20:32:48.312772 env[1563]: time="2024-02-09T20:32:48.312729257Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.26.13\" returns image reference \"sha256:4fe82b56f06250b6b7eb3d5a879cd2cfabf41cb3e45b24af6059eadbc3b8026e\""
Feb 9 20:32:48.319547 env[1563]: time="2024-02-09T20:32:48.319512094Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.26.13\""
Feb 9 20:32:49.201567 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2897006103.mount: Deactivated successfully.
Feb 9 20:32:49.799447 env[1563]: time="2024-02-09T20:32:49.799419704Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/kube-proxy:v1.26.13,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 9 20:32:49.799974 env[1563]: time="2024-02-09T20:32:49.799932711Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:5a7325fa2b6e8d712e4a770abb4a5a5852e87b6de8df34552d67853e9bfb9f9f,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 9 20:32:49.800758 env[1563]: time="2024-02-09T20:32:49.800717000Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/kube-proxy:v1.26.13,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 9 20:32:49.801458 env[1563]: time="2024-02-09T20:32:49.801372765Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/kube-proxy@sha256:f6e0de32a002b910b9b2e0e8d769e2d7b05208240559c745ce4781082ab15f22,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 9 20:32:49.801772 env[1563]: time="2024-02-09T20:32:49.801730800Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.26.13\" returns image reference \"sha256:5a7325fa2b6e8d712e4a770abb4a5a5852e87b6de8df34552d67853e9bfb9f9f\""
Feb 9 20:32:49.807541 env[1563]: time="2024-02-09T20:32:49.807484715Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\""
Feb 9 20:32:50.325339 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2347648803.mount: Deactivated successfully.
Feb 9 20:32:50.326519 env[1563]: time="2024-02-09T20:32:50.326478436Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/pause:3.9,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 9 20:32:50.327184 env[1563]: time="2024-02-09T20:32:50.327142911Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 9 20:32:50.327816 env[1563]: time="2024-02-09T20:32:50.327774923Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/pause:3.9,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 9 20:32:50.328845 env[1563]: time="2024-02-09T20:32:50.328809619Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 9 20:32:50.329105 env[1563]: time="2024-02-09T20:32:50.329053607Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\" returns image reference \"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\""
Feb 9 20:32:50.334328 env[1563]: time="2024-02-09T20:32:50.334307735Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.6-0\""
Feb 9 20:32:50.554921 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
Feb 9 20:32:50.555482 systemd[1]: Stopped kubelet.service.
Feb 9 20:32:50.554000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:32:50.559318 systemd[1]: Started kubelet.service.
Feb 9 20:32:50.582583 kernel: kauditd_printk_skb: 2 callbacks suppressed
Feb 9 20:32:50.582664 kernel: audit: type=1130 audit(1707510770.554:177): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:32:50.584159 kubelet[2094]: E0209 20:32:50.584109 2094 run.go:74] "command failed" err="failed to validate kubelet flags: the container runtime endpoint address was not specified or empty, use --container-runtime-endpoint to set"
Feb 9 20:32:50.586782 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Feb 9 20:32:50.586870 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Feb 9 20:32:50.554000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:32:50.713295 kernel: audit: type=1131 audit(1707510770.554:178): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:32:50.713321 kernel: audit: type=1130 audit(1707510770.558:179): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:32:50.558000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:32:50.586000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed'
Feb 9 20:32:50.843333 kernel: audit: type=1131 audit(1707510770.586:180): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed'
Feb 9 20:32:51.001242 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4294274676.mount: Deactivated successfully.
Feb 9 20:32:53.925885 env[1563]: time="2024-02-09T20:32:53.925858615Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/etcd:3.5.6-0,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 9 20:32:53.926508 env[1563]: time="2024-02-09T20:32:53.926496691Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:fce326961ae2d51a5f726883fd59d2a8c2ccc3e45d3bb859882db58e422e59e7,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 9 20:32:53.927882 env[1563]: time="2024-02-09T20:32:53.927867886Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/etcd:3.5.6-0,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 9 20:32:53.928721 env[1563]: time="2024-02-09T20:32:53.928691910Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/etcd@sha256:dd75ec974b0a2a6f6bb47001ba09207976e625db898d1b16735528c009cb171c,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 9 20:32:53.929091 env[1563]: time="2024-02-09T20:32:53.929064910Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.6-0\" returns image reference \"sha256:fce326961ae2d51a5f726883fd59d2a8c2ccc3e45d3bb859882db58e422e59e7\""
Feb 9 20:32:53.936334 env[1563]: time="2024-02-09T20:32:53.936313527Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.9.3\""
Feb 9 20:32:54.489304 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2409932683.mount: Deactivated successfully.
Feb 9 20:32:54.932993 env[1563]: time="2024-02-09T20:32:54.932948899Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/coredns/coredns:v1.9.3,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 9 20:32:54.933621 env[1563]: time="2024-02-09T20:32:54.933581621Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:5185b96f0becf59032b8e3646e99f84d9655dff3ac9e2605e0dc77f9c441ae4a,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 9 20:32:54.934468 env[1563]: time="2024-02-09T20:32:54.934427760Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/coredns/coredns:v1.9.3,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 9 20:32:54.935438 env[1563]: time="2024-02-09T20:32:54.935389800Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/coredns/coredns@sha256:8e352a029d304ca7431c6507b56800636c321cb52289686a581ab70aaa8a2e2a,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 9 20:32:54.935832 env[1563]: time="2024-02-09T20:32:54.935772911Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.9.3\" returns image reference \"sha256:5185b96f0becf59032b8e3646e99f84d9655dff3ac9e2605e0dc77f9c441ae4a\""
Feb 9 20:32:56.094225 systemd[1]: Stopped kubelet.service.
Feb 9 20:32:56.092000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:32:56.103486 systemd[1]: Reloading.
Feb 9 20:32:56.133463 /usr/lib/systemd/system-generators/torcx-generator[2252]: time="2024-02-09T20:32:56Z" level=debug msg="common configuration parsed" base_dir=/var/lib/torcx/ conf_dir=/etc/torcx/ run_dir=/run/torcx/ store_paths="[/usr/share/torcx/store /usr/share/oem/torcx/store/3510.3.2 /usr/share/oem/torcx/store /var/lib/torcx/store/3510.3.2 /var/lib/torcx/store]"
Feb 9 20:32:56.133533 /usr/lib/systemd/system-generators/torcx-generator[2252]: time="2024-02-09T20:32:56Z" level=info msg="torcx already run"
Feb 9 20:32:56.092000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:32:56.161396 kernel: audit: type=1130 audit(1707510776.092:181): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:32:56.161441 kernel: audit: type=1131 audit(1707510776.092:182): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:32:56.248365 systemd[1]: /usr/lib/systemd/system/locksmithd.service:8: Unit uses CPUShares=; please use CPUWeight= instead. Support for CPUShares= will be removed soon.
Feb 9 20:32:56.248372 systemd[1]: /usr/lib/systemd/system/locksmithd.service:9: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 9 20:32:56.259222 systemd[1]: /run/systemd/system/docker.socket:8: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Feb 9 20:32:56.314300 systemd[1]: Started kubelet.service.
Feb 9 20:32:56.313000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:32:56.336798 kubelet[2318]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.27. Image garbage collector will get sandbox image information from CRI.
Feb 9 20:32:56.336798 kubelet[2318]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Feb 9 20:32:56.336798 kubelet[2318]: I0209 20:32:56.336792 2318 server.go:198] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Feb 9 20:32:56.337618 kubelet[2318]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.27. Image garbage collector will get sandbox image information from CRI.
Feb 9 20:32:56.337618 kubelet[2318]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Feb 9 20:32:56.379404 kernel: audit: type=1130 audit(1707510776.313:183): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:32:56.797016 kubelet[2318]: I0209 20:32:56.796974 2318 server.go:412] "Kubelet version" kubeletVersion="v1.26.5"
Feb 9 20:32:56.797016 kubelet[2318]: I0209 20:32:56.796986 2318 server.go:414] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Feb 9 20:32:56.797145 kubelet[2318]: I0209 20:32:56.797105 2318 server.go:836] "Client rotation is on, will bootstrap in background"
Feb 9 20:32:56.798521 kubelet[2318]: I0209 20:32:56.798473 2318 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Feb 9 20:32:56.798926 kubelet[2318]: E0209 20:32:56.798888 2318 certificate_manager.go:471] kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post "https://86.109.11.101:6443/apis/certificates.k8s.io/v1/certificatesigningrequests": dial tcp 86.109.11.101:6443: connect: connection refused
Feb 9 20:32:56.819559 kubelet[2318]: I0209 20:32:56.819551 2318 server.go:659] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Feb 9 20:32:56.819743 kubelet[2318]: I0209 20:32:56.819737 2318 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Feb 9 20:32:56.819777 kubelet[2318]: I0209 20:32:56.819773 2318 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={RuntimeCgroupsName: SystemCgroupsName: KubeletCgroupsName: KubeletOOMScoreAdj:-999 ContainerRuntime: CgroupsPerQOS:true CgroupRoot:/ CgroupDriver:cgroupfs KubeletRootDir:/var/lib/kubelet ProtectKernelDefaults:false NodeAllocatableConfig:{KubeReservedCgroupName: SystemReservedCgroupName: ReservedSystemCPUs: EnforceNodeAllocatable:map[pods:{}] KubeReserved:map[] SystemReserved:map[] HardEvictionThresholds:[{Signal:imagefs.available Operator:LessThan Value:{Quantity: Percentage:0.15} GracePeriod:0s MinReclaim:} {Signal:memory.available Operator:LessThan Value:{Quantity:100Mi Percentage:0} GracePeriod:0s MinReclaim:} {Signal:nodefs.available Operator:LessThan Value:{Quantity: Percentage:0.1} GracePeriod:0s MinReclaim:} {Signal:nodefs.inodesFree Operator:LessThan Value:{Quantity: Percentage:0.05} GracePeriod:0s MinReclaim:}]} QOSReserved:map[] CPUManagerPolicy:none CPUManagerPolicyOptions:map[] ExperimentalTopologyManagerScope:container CPUManagerReconcilePeriod:10s ExperimentalMemoryManagerPolicy:None ExperimentalMemoryManagerReservedMemory:[] ExperimentalPodPidsLimit:-1 EnforceCPULimits:true CPUCFSQuotaPeriod:100ms ExperimentalTopologyManagerPolicy:none ExperimentalTopologyManagerPolicyOptions:map[]}
Feb 9 20:32:56.819836 kubelet[2318]: I0209 20:32:56.819783 2318 topology_manager.go:134] "Creating topology manager with policy per scope" topologyPolicyName="none" topologyScopeName="container"
Feb 9 20:32:56.819836 kubelet[2318]: I0209 20:32:56.819790 2318 container_manager_linux.go:308] "Creating device plugin manager"
Feb 9 20:32:56.819872 kubelet[2318]: I0209 20:32:56.819836 2318 state_mem.go:36] "Initialized new in-memory state store"
Feb 9 20:32:56.821590 kubelet[2318]: I0209 20:32:56.821552 2318 kubelet.go:398] "Attempting to sync node with API server"
Feb 9 20:32:56.821590 kubelet[2318]: I0209 20:32:56.821562 2318 kubelet.go:286] "Adding static pod path" path="/etc/kubernetes/manifests"
Feb 9 20:32:56.821590 kubelet[2318]: I0209 20:32:56.821573 2318 kubelet.go:297] "Adding apiserver pod source"
Feb 9 20:32:56.821590 kubelet[2318]: I0209 20:32:56.821580 2318 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Feb 9 20:32:56.821904 kubelet[2318]: W0209 20:32:56.821878 2318 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.Service: Get "https://86.109.11.101:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 86.109.11.101:6443: connect: connection refused
Feb 9 20:32:56.821904 kubelet[2318]: I0209 20:32:56.821895 2318 kuberuntime_manager.go:244] "Container runtime initialized" containerRuntime="containerd" version="1.6.16" apiVersion="v1"
Feb 9 20:32:56.821904 kubelet[2318]: E0209 20:32:56.821905 2318 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://86.109.11.101:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 86.109.11.101:6443: connect: connection refused
Feb 9 20:32:56.821967 kubelet[2318]: W0209 20:32:56.821912 2318 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.Node: Get "https://86.109.11.101:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-3510.3.2-a-45f40c263c&limit=500&resourceVersion=0": dial tcp 86.109.11.101:6443: connect: connection refused
Feb 9 20:32:56.821967 kubelet[2318]: E0209 20:32:56.821935 2318 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://86.109.11.101:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-3510.3.2-a-45f40c263c&limit=500&resourceVersion=0": dial tcp 86.109.11.101:6443: connect: connection refused
Feb 9 20:32:56.822022 kubelet[2318]: W0209 20:32:56.822013 2318 probe.go:268] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Feb 9 20:32:56.822205 kubelet[2318]: I0209 20:32:56.822199 2318 server.go:1186] "Started kubelet"
Feb 9 20:32:56.822231 kubelet[2318]: I0209 20:32:56.822225 2318 server.go:161] "Starting to listen" address="0.0.0.0" port=10250
Feb 9 20:32:56.822432 kubelet[2318]: E0209 20:32:56.822423 2318 cri_stats_provider.go:455] "Failed to get the info of the filesystem with mountpoint" err="unable to find data in memory cache" mountpoint="/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs"
Feb 9 20:32:56.822466 kubelet[2318]: E0209 20:32:56.822435 2318 kubelet.go:1386] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Feb 9 20:32:56.822485 kubelet[2318]: E0209 20:32:56.822417 2318 event.go:276] Unable to write event: '&v1.Event{TypeMeta:v1.TypeMeta{Kind:"", APIVersion:""}, ObjectMeta:v1.ObjectMeta{Name:"ci-3510.3.2-a-45f40c263c.17b24c0136a34830", GenerateName:"", Namespace:"default", SelfLink:"", UID:"", ResourceVersion:"", Generation:0, CreationTimestamp:time.Date(1, time.January, 1, 0, 0, 0, 0, time.UTC), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string(nil), Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, InvolvedObject:v1.ObjectReference{Kind:"Node", Namespace:"", Name:"ci-3510.3.2-a-45f40c263c", UID:"ci-3510.3.2-a-45f40c263c", APIVersion:"", ResourceVersion:"", FieldPath:""}, Reason:"Starting", Message:"Starting kubelet.", Source:v1.EventSource{Component:"kubelet", Host:"ci-3510.3.2-a-45f40c263c"}, FirstTimestamp:time.Date(2024, time.February, 9, 20, 32, 56, 822188080, time.Local), LastTimestamp:time.Date(2024, time.February, 9, 20, 32, 56, 822188080, time.Local), Count:1, Type:"Normal", EventTime:time.Date(1, time.January, 1, 0, 0, 0, 0, time.UTC), Series:(*v1.EventSeries)(nil), Action:"", Related:(*v1.ObjectReference)(nil), ReportingController:"", ReportingInstance:""}': 'Post "https://86.109.11.101:6443/api/v1/namespaces/default/events": dial tcp 86.109.11.101:6443: connect: connection refused'(may retry after sleeping)
Feb 9 20:32:56.822771 kubelet[2318]: I0209 20:32:56.822764 2318 server.go:451] "Adding debug handlers to kubelet server"
Feb 9 20:32:56.822000 audit[2318]: AVC avc: denied { mac_admin } for pid=2318 comm="kubelet" capability=33 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 9 20:32:56.822983 kubelet[2318]: I0209 20:32:56.822962 2318 kubelet.go:1341] "Unprivileged containerized plugins might not work, could not set selinux context on plugin registration dir" path="/var/lib/kubelet/plugins_registry" err="setxattr /var/lib/kubelet/plugins_registry: invalid argument"
Feb 9 20:32:56.823005 kubelet[2318]: I0209 20:32:56.822985 2318 kubelet.go:1345] "Unprivileged containerized plugins might not work, could not set selinux context on plugins dir" path="/var/lib/kubelet/plugins" err="setxattr /var/lib/kubelet/plugins: invalid argument"
Feb 9 20:32:56.823024 kubelet[2318]: I0209 20:32:56.823019 2318 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Feb 9 20:32:56.823050 kubelet[2318]: I0209 20:32:56.823041 2318 volume_manager.go:293] "Starting Kubelet Volume Manager"
Feb 9 20:32:56.823079 kubelet[2318]: I0209 20:32:56.823070 2318 desired_state_of_world_populator.go:151] "Desired state populator starts to run"
Feb 9 20:32:56.823106 kubelet[2318]: E0209 20:32:56.823087 2318 kubelet_node_status.go:458] "Error getting the current node from lister" err="node \"ci-3510.3.2-a-45f40c263c\" not found"
Feb 9 20:32:56.823511 kubelet[2318]: E0209 20:32:56.823492 2318 controller.go:146] failed to ensure lease exists, will retry in 200ms, error: Get "https://86.109.11.101:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-3510.3.2-a-45f40c263c?timeout=10s": dial tcp 86.109.11.101:6443: connect: connection refused
Feb 9 20:32:56.824202 kubelet[2318]: W0209 20:32:56.824163 2318 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.CSIDriver: Get "https://86.109.11.101:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 86.109.11.101:6443: connect: connection refused
Feb 9 20:32:56.824251 kubelet[2318]: E0209 20:32:56.824214 2318 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://86.109.11.101:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 86.109.11.101:6443: connect: connection refused
Feb 9 20:32:56.886538 kubelet[2318]: I0209 20:32:56.886526 2318 cpu_manager.go:214] "Starting CPU manager" policy="none"
Feb 9 20:32:56.886538 kubelet[2318]: I0209 20:32:56.886536 2318 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s"
Feb 9 20:32:56.886622 kubelet[2318]: I0209 20:32:56.886544 2318 state_mem.go:36] "Initialized new in-memory state store"
Feb 9 20:32:56.887447 kubelet[2318]: I0209 20:32:56.887439 2318 policy_none.go:49] "None policy: Start"
Feb 9 20:32:56.887622 kubelet[2318]: I0209 20:32:56.887616 2318 memory_manager.go:169] "Starting memorymanager" policy="None"
Feb 9 20:32:56.887653 kubelet[2318]: I0209 20:32:56.887624 2318 state_mem.go:35] "Initializing new in-memory state store"
Feb 9 20:32:56.822000 audit: SELINUX_ERR op=setxattr invalid_context="system_u:object_r:container_file_t:s0"
Feb 9 20:32:56.889349 kernel: audit: type=1400 audit(1707510776.822:184): avc: denied { mac_admin } for pid=2318 comm="kubelet" capability=33 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 9 20:32:56.889374 kernel: audit: type=1401 audit(1707510776.822:184): op=setxattr invalid_context="system_u:object_r:container_file_t:s0"
Feb 9 20:32:56.822000 audit[2318]: SYSCALL arch=c000003e syscall=188 success=no exit=-22 a0=c0012463c0 a1=c0012225a0 a2=c001246390 a3=25 items=0 ppid=1 pid=2318 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kubelet" exe="/opt/bin/kubelet" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 9 20:32:56.923721 kubelet[2318]: I0209 20:32:56.923699 2318 manager.go:455] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Feb 9 20:32:56.923800 kubelet[2318]: I0209 20:32:56.923765 2318 server.go:88] "Unprivileged containerized plugins might not work. Could not set selinux context on socket dir" path="/var/lib/kubelet/device-plugins/" err="setxattr /var/lib/kubelet/device-plugins/: invalid argument"
Feb 9 20:32:56.923963 kubelet[2318]: I0209 20:32:56.923954 2318 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Feb 9 20:32:56.924508 kubelet[2318]: E0209 20:32:56.924498 2318 eviction_manager.go:261] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-3510.3.2-a-45f40c263c\" not found"
Feb 9 20:32:56.925227 kubelet[2318]: I0209 20:32:56.925220 2318 kubelet_node_status.go:70] "Attempting to register node" node="ci-3510.3.2-a-45f40c263c"
Feb 9 20:32:56.925400 kubelet[2318]: E0209 20:32:56.925393 2318 kubelet_node_status.go:92] "Unable to register node with API server" err="Post \"https://86.109.11.101:6443/api/v1/nodes\": dial tcp 86.109.11.101:6443: connect: connection refused" node="ci-3510.3.2-a-45f40c263c"
Feb 9 20:32:57.017765 kernel: audit: type=1300 audit(1707510776.822:184): arch=c000003e syscall=188 success=no exit=-22 a0=c0012463c0 a1=c0012225a0 a2=c001246390 a3=25 items=0 ppid=1 pid=2318 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kubelet" exe="/opt/bin/kubelet" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 9 20:32:57.017792 kernel: audit: type=1327 audit(1707510776.822:184): proctitle=2F6F70742F62696E2F6B7562656C6574002D2D626F6F7473747261702D6B756265636F6E6669673D2F6574632F6B756265726E657465732F626F6F7473747261702D6B7562656C65742E636F6E66002D2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F6B7562656C65742E636F6E66002D2D636F6E6669
Feb 9 20:32:56.822000 audit: PROCTITLE proctitle=2F6F70742F62696E2F6B7562656C6574002D2D626F6F7473747261702D6B756265636F6E6669673D2F6574632F6B756265726E657465732F626F6F7473747261702D6B7562656C65742E636F6E66002D2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F6B7562656C65742E636F6E66002D2D636F6E6669
Feb 9 20:32:57.024160 kubelet[2318]: E0209 20:32:57.024118 2318 controller.go:146] failed to ensure lease exists, will retry in 400ms, error: Get "https://86.109.11.101:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-3510.3.2-a-45f40c263c?timeout=10s": dial tcp 86.109.11.101:6443: connect: connection refused
Feb 9 20:32:57.111570 kernel: audit: type=1400 audit(1707510776.822:185): avc: denied { mac_admin } for pid=2318 comm="kubelet" capability=33 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 9 20:32:56.822000 audit[2318]: AVC avc: denied { mac_admin } for pid=2318 comm="kubelet" capability=33 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 9 20:32:57.126195 kubelet[2318]: I0209 20:32:57.126160 2318 kubelet_node_status.go:70] "Attempting to register node" node="ci-3510.3.2-a-45f40c263c"
Feb 9 20:32:57.126272 kubelet[2318]: E0209 20:32:57.126266 2318 kubelet_node_status.go:92] "Unable to register node with API server" err="Post \"https://86.109.11.101:6443/api/v1/nodes\": dial tcp 86.109.11.101:6443: connect: connection refused" node="ci-3510.3.2-a-45f40c263c"
Feb 9 20:32:57.176807 kernel: audit: type=1401 audit(1707510776.822:185): op=setxattr invalid_context="system_u:object_r:container_file_t:s0"
Feb 9 20:32:56.822000 audit: SELINUX_ERR op=setxattr invalid_context="system_u:object_r:container_file_t:s0"
Feb 9 20:32:56.822000 audit[2318]: SYSCALL arch=c000003e syscall=188 success=no exit=-22 a0=c001256480 a1=c0012225b8 a2=c001246450 a3=25 items=0 ppid=1 pid=2318 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kubelet" exe="/opt/bin/kubelet" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 9 20:32:57.307176 kernel: audit: type=1300 audit(1707510776.822:185): arch=c000003e syscall=188 success=no exit=-22 a0=c001256480 a1=c0012225b8 a2=c001246450 a3=25 items=0 ppid=1 pid=2318 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kubelet" exe="/opt/bin/kubelet" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 9 20:32:56.822000 audit: PROCTITLE proctitle=2F6F70742F62696E2F6B7562656C6574002D2D626F6F7473747261702D6B756265636F6E6669673D2F6574632F6B756265726E657465732F626F6F7473747261702D6B7562656C65742E636F6E66002D2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F6B7562656C65742E636F6E66002D2D636F6E6669
Feb 9 20:32:56.824000 audit[2342]: NETFILTER_CFG table=mangle:26 family=2 entries=2 op=nft_register_chain pid=2342 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Feb 9 20:32:56.824000 audit[2342]: SYSCALL arch=c000003e syscall=46 success=yes exit=136 a0=3 a1=7fff4bc5bdc0 a2=0 a3=7fff4bc5bdac items=0 ppid=2318 pid=2342 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 9 20:32:56.824000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65
Feb 9 20:32:56.825000 audit[2343]: NETFILTER_CFG table=filter:27 family=2 entries=1 op=nft_register_chain pid=2343 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Feb 9 20:32:56.825000 audit[2343]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fff6287f740 a2=0 a3=7fff6287f72c items=0 ppid=2318 pid=2343 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 9 20:32:56.825000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4649524557414C4C002D740066696C746572
Feb 9 20:32:56.826000 audit[2345]: NETFILTER_CFG table=filter:28 family=2 entries=2 op=nft_register_chain pid=2345 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Feb 9 20:32:56.826000 audit[2345]: SYSCALL arch=c000003e syscall=46 success=yes exit=312 a0=3 a1=7ffe1a036400 a2=0 a3=7ffe1a0363ec items=0 ppid=2318 pid=2345 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 9 20:32:56.826000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C
Feb 9 20:32:56.827000 audit[2347]: NETFILTER_CFG table=filter:29 family=2 entries=2 op=nft_register_chain pid=2347 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Feb 9 20:32:56.827000 audit[2347]: SYSCALL arch=c000003e syscall=46 success=yes exit=312 a0=3 a1=7fff4a13d540 a2=0 a3=7fff4a13d52c items=0 ppid=2318 pid=2347 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 9 20:32:56.827000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C
Feb 9 20:32:56.922000 audit[2318]: AVC avc: denied { mac_admin } for pid=2318 comm="kubelet" capability=33 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 9 20:32:56.922000 audit: SELINUX_ERR op=setxattr invalid_context="system_u:object_r:container_file_t:s0"
Feb 9 20:32:56.922000 audit[2318]: SYSCALL arch=c000003e syscall=188 success=no exit=-22 a0=c0017b2e40 a1=c000cc9d58 a2=c0017b2e10 a3=25 items=0 ppid=1 pid=2318 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kubelet" exe="/opt/bin/kubelet" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 9 20:32:56.922000 audit: PROCTITLE proctitle=2F6F70742F62696E2F6B7562656C6574002D2D626F6F7473747261702D6B756265636F6E6669673D2F6574632F6B756265726E657465732F626F6F7473747261702D6B7562656C65742E636F6E66002D2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F6B7562656C65742E636F6E66002D2D636F6E6669
Feb 9 20:32:57.209000 audit[2351]: NETFILTER_CFG table=filter:30 family=2 entries=1 op=nft_register_rule pid=2351 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Feb 9 20:32:57.209000 audit[2351]: SYSCALL arch=c000003e syscall=46 success=yes exit=924 a0=3 a1=7ffdedc71c50 a2=0 a3=7ffdedc71c3c items=0 ppid=2318 pid=2351 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 9 20:32:57.209000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D41004B5542452D4649524557414C4C002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E7400626C6F636B20696E636F6D696E67206C6F63616C6E657420636F6E6E656374696F6E73002D2D647374003132372E302E302E302F38
Feb 9 20:32:57.210000 audit[2352]: NETFILTER_CFG table=nat:31 family=2 entries=1 op=nft_register_chain pid=2352 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Feb 9 20:32:57.210000 audit[2352]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7fffd13439e0 a2=0 a3=7fffd13439cc items=0 ppid=2318 pid=2352 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 9 20:32:57.210000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4D41524B2D44524F50002D74006E6174
Feb 9 20:32:57.307000 audit[2355]: NETFILTER_CFG table=nat:32 family=2 entries=1 op=nft_register_rule pid=2355 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Feb 9 20:32:57.307000 audit[2355]: SYSCALL arch=c000003e syscall=46 success=yes exit=216 a0=3 a1=7fff90f49350 a2=0 a3=7fff90f4933c items=0 ppid=2318 pid=2355 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 9 20:32:57.307000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D41004B5542452D4D41524B2D44524F50002D74006E6174002D6A004D41524B002D2D6F722D6D61726B0030783030303038303030
Feb 9 20:32:57.309000 audit[2358]: NETFILTER_CFG table=filter:33 family=2 entries=1 op=nft_register_rule pid=2358 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Feb 9 20:32:57.309000 audit[2358]: SYSCALL arch=c000003e syscall=46 success=yes exit=664 a0=3 a1=7fffc637f930 a2=0 a3=7fffc637f91c items=0 ppid=2318 pid=2358 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 9 20:32:57.309000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D41004B5542452D4649524557414C4C002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206669726577616C6C20666F722064726F7070696E67206D61726B6564207061636B657473002D6D006D61726B
Feb 9
20:32:57.309000 audit[2359]: NETFILTER_CFG table=nat:34 family=2 entries=1 op=nft_register_chain pid=2359 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 9 20:32:57.309000 audit[2359]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffe12f46be0 a2=0 a3=7ffe12f46bcc items=0 ppid=2318 pid=2359 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:32:57.309000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4D41524B2D4D415351002D74006E6174 Feb 9 20:32:57.310000 audit[2360]: NETFILTER_CFG table=nat:35 family=2 entries=1 op=nft_register_chain pid=2360 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 9 20:32:57.310000 audit[2360]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffc830be310 a2=0 a3=7ffc830be2fc items=0 ppid=2318 pid=2360 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:32:57.310000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Feb 9 20:32:57.311000 audit[2362]: NETFILTER_CFG table=nat:36 family=2 entries=1 op=nft_register_rule pid=2362 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 9 20:32:57.311000 audit[2362]: SYSCALL arch=c000003e syscall=46 success=yes exit=216 a0=3 a1=7ffd109c8a10 a2=0 a3=7ffd109c89fc items=0 ppid=2318 pid=2362 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:32:57.311000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D5700313030303030002D41004B5542452D4D41524B2D4D415351002D74006E6174002D6A004D41524B002D2D6F722D6D61726B0030783030303034303030 Feb 9 20:32:57.312000 audit[2364]: NETFILTER_CFG table=nat:37 family=2 entries=1 op=nft_register_rule pid=2364 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 9 20:32:57.312000 audit[2364]: SYSCALL arch=c000003e syscall=46 success=yes exit=532 a0=3 a1=7ffd0460e210 a2=0 a3=7ffd0460e1fc items=0 ppid=2318 pid=2364 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:32:57.312000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Feb 9 20:32:57.313000 audit[2366]: NETFILTER_CFG table=nat:38 family=2 entries=1 op=nft_register_rule pid=2366 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 9 20:32:57.313000 audit[2366]: SYSCALL arch=c000003e syscall=46 success=yes exit=364 a0=3 a1=7ffd6e9767a0 a2=0 a3=7ffd6e97678c items=0 ppid=2318 pid=2366 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:32:57.313000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D41004B5542452D504F5354524F5554494E47002D74006E6174002D6D006D61726B0000002D2D6D61726B00307830303030343030302F30783030303034303030002D6A0052455455524E Feb 9 20:32:57.314000 audit[2368]: NETFILTER_CFG table=nat:39 family=2 entries=1 op=nft_register_rule pid=2368 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 9 20:32:57.314000 audit[2368]: SYSCALL arch=c000003e syscall=46 success=yes exit=220 a0=3 a1=7ffeff847490 a2=0 
a3=7ffeff84747c items=0 ppid=2318 pid=2368 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:32:57.314000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D41004B5542452D504F5354524F5554494E47002D74006E6174002D6A004D41524B002D2D786F722D6D61726B0030783030303034303030 Feb 9 20:32:57.316000 audit[2370]: NETFILTER_CFG table=nat:40 family=2 entries=1 op=nft_register_rule pid=2370 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 9 20:32:57.316000 audit[2370]: SYSCALL arch=c000003e syscall=46 success=yes exit=540 a0=3 a1=7ffd2edbece0 a2=0 a3=7ffd2edbeccc items=0 ppid=2318 pid=2370 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:32:57.316000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D41004B5542452D504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732073657276696365207472616666696320726571756972696E6720534E4154002D6A004D415351554552414445 Feb 9 20:32:57.316750 kubelet[2318]: I0209 20:32:57.316711 2318 kubelet_network_linux.go:63] "Initialized iptables rules." 
protocol=IPv4 Feb 9 20:32:57.316000 audit[2371]: NETFILTER_CFG table=mangle:41 family=10 entries=2 op=nft_register_chain pid=2371 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 9 20:32:57.316000 audit[2371]: SYSCALL arch=c000003e syscall=46 success=yes exit=136 a0=3 a1=7ffe1ff6fa60 a2=0 a3=7ffe1ff6fa4c items=0 ppid=2318 pid=2371 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:32:57.316000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Feb 9 20:32:57.316000 audit[2372]: NETFILTER_CFG table=mangle:42 family=2 entries=1 op=nft_register_chain pid=2372 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 9 20:32:57.316000 audit[2372]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffe2fe9a2f0 a2=0 a3=7ffe2fe9a2dc items=0 ppid=2318 pid=2372 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:32:57.316000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Feb 9 20:32:57.317000 audit[2373]: NETFILTER_CFG table=nat:43 family=10 entries=2 op=nft_register_chain pid=2373 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 9 20:32:57.317000 audit[2373]: SYSCALL arch=c000003e syscall=46 success=yes exit=124 a0=3 a1=7ffdd2972930 a2=0 a3=7ffdd297291c items=0 ppid=2318 pid=2373 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:32:57.317000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4D41524B2D44524F50002D74006E6174 Feb 9 20:32:57.317000 audit[2374]: NETFILTER_CFG table=nat:44 family=2 entries=1 op=nft_register_chain pid=2374 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 9 20:32:57.317000 audit[2374]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fffc9878530 a2=0 a3=7fffc987851c items=0 ppid=2318 pid=2374 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:32:57.317000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Feb 9 20:32:57.317000 audit[2376]: NETFILTER_CFG table=filter:45 family=2 entries=1 op=nft_register_chain pid=2376 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 9 20:32:57.317000 audit[2376]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffcba362f20 a2=0 a3=7ffcba362f0c items=0 ppid=2318 pid=2376 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:32:57.317000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Feb 9 20:32:57.317000 audit[2377]: NETFILTER_CFG table=nat:46 family=10 entries=1 op=nft_register_rule pid=2377 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 9 20:32:57.317000 audit[2377]: SYSCALL arch=c000003e syscall=46 success=yes exit=216 a0=3 a1=7ffc387cf7c0 a2=0 a3=7ffc387cf7ac items=0 ppid=2318 pid=2377 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:32:57.317000 
audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D41004B5542452D4D41524B2D44524F50002D74006E6174002D6A004D41524B002D2D6F722D6D61726B0030783030303038303030 Feb 9 20:32:57.317000 audit[2378]: NETFILTER_CFG table=filter:47 family=10 entries=2 op=nft_register_chain pid=2378 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 9 20:32:57.317000 audit[2378]: SYSCALL arch=c000003e syscall=46 success=yes exit=132 a0=3 a1=7ffdebdcada0 a2=0 a3=7ffdebdcad8c items=0 ppid=2318 pid=2378 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:32:57.317000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4649524557414C4C002D740066696C746572 Feb 9 20:32:57.319000 audit[2380]: NETFILTER_CFG table=filter:48 family=10 entries=1 op=nft_register_rule pid=2380 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 9 20:32:57.319000 audit[2380]: SYSCALL arch=c000003e syscall=46 success=yes exit=664 a0=3 a1=7fff8a6b3600 a2=0 a3=7fff8a6b35ec items=0 ppid=2318 pid=2380 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:32:57.319000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D41004B5542452D4649524557414C4C002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206669726577616C6C20666F722064726F7070696E67206D61726B6564207061636B657473002D6D006D61726B Feb 9 20:32:57.320000 audit[2381]: NETFILTER_CFG table=nat:49 family=10 entries=1 op=nft_register_chain pid=2381 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 9 20:32:57.320000 audit[2381]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7fff471fb9c0 a2=0 a3=7fff471fb9ac items=0 ppid=2318 
pid=2381 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:32:57.320000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4D41524B2D4D415351002D74006E6174 Feb 9 20:32:57.320000 audit[2382]: NETFILTER_CFG table=nat:50 family=10 entries=1 op=nft_register_chain pid=2382 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 9 20:32:57.320000 audit[2382]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fffa7c575b0 a2=0 a3=7fffa7c5759c items=0 ppid=2318 pid=2382 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:32:57.320000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Feb 9 20:32:57.321000 audit[2384]: NETFILTER_CFG table=nat:51 family=10 entries=1 op=nft_register_rule pid=2384 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 9 20:32:57.321000 audit[2384]: SYSCALL arch=c000003e syscall=46 success=yes exit=216 a0=3 a1=7fff92a4fda0 a2=0 a3=7fff92a4fd8c items=0 ppid=2318 pid=2384 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:32:57.321000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D41004B5542452D4D41524B2D4D415351002D74006E6174002D6A004D41524B002D2D6F722D6D61726B0030783030303034303030 Feb 9 20:32:57.321000 audit[2386]: NETFILTER_CFG table=nat:52 family=10 entries=2 op=nft_register_chain pid=2386 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 9 20:32:57.321000 audit[2386]: SYSCALL arch=c000003e syscall=46 success=yes exit=612 
a0=3 a1=7ffde5735720 a2=0 a3=7ffde573570c items=0 ppid=2318 pid=2386 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:32:57.321000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Feb 9 20:32:57.324000 audit[2388]: NETFILTER_CFG table=nat:53 family=10 entries=1 op=nft_register_rule pid=2388 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 9 20:32:57.324000 audit[2388]: SYSCALL arch=c000003e syscall=46 success=yes exit=364 a0=3 a1=7ffe1443cc50 a2=0 a3=7ffe1443cc3c items=0 ppid=2318 pid=2388 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:32:57.324000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D41004B5542452D504F5354524F5554494E47002D74006E6174002D6D006D61726B0000002D2D6D61726B00307830303030343030302F30783030303034303030002D6A0052455455524E Feb 9 20:32:57.325000 audit[2390]: NETFILTER_CFG table=nat:54 family=10 entries=1 op=nft_register_rule pid=2390 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 9 20:32:57.325000 audit[2390]: SYSCALL arch=c000003e syscall=46 success=yes exit=220 a0=3 a1=7ffd4096e310 a2=0 a3=7ffd4096e2fc items=0 ppid=2318 pid=2390 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:32:57.325000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D5700313030303030002D41004B5542452D504F5354524F5554494E47002D74006E6174002D6A004D41524B002D2D786F722D6D61726B0030783030303034303030 Feb 9 20:32:57.325000 audit[2392]: NETFILTER_CFG table=nat:55 family=10 entries=1 op=nft_register_rule pid=2392 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 9 20:32:57.325000 audit[2392]: SYSCALL arch=c000003e syscall=46 success=yes exit=556 a0=3 a1=7ffc2d58ee50 a2=0 a3=7ffc2d58ee3c items=0 ppid=2318 pid=2392 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:32:57.325000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D41004B5542452D504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732073657276696365207472616666696320726571756972696E6720534E4154002D6A004D415351554552414445 Feb 9 20:32:57.327419 kubelet[2318]: I0209 20:32:57.327360 2318 kubelet_network_linux.go:63] "Initialized iptables rules." 
protocol=IPv6 Feb 9 20:32:57.327419 kubelet[2318]: I0209 20:32:57.327369 2318 status_manager.go:176] "Starting to sync pod status with apiserver" Feb 9 20:32:57.327419 kubelet[2318]: I0209 20:32:57.327378 2318 kubelet.go:2113] "Starting kubelet main sync loop" Feb 9 20:32:57.327419 kubelet[2318]: E0209 20:32:57.327402 2318 kubelet.go:2137] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Feb 9 20:32:57.327629 kubelet[2318]: W0209 20:32:57.327607 2318 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.RuntimeClass: Get "https://86.109.11.101:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 86.109.11.101:6443: connect: connection refused Feb 9 20:32:57.327629 kubelet[2318]: E0209 20:32:57.327627 2318 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://86.109.11.101:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 86.109.11.101:6443: connect: connection refused Feb 9 20:32:57.327000 audit[2393]: NETFILTER_CFG table=mangle:56 family=10 entries=1 op=nft_register_chain pid=2393 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 9 20:32:57.327000 audit[2393]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffcb1044c10 a2=0 a3=7ffcb1044bfc items=0 ppid=2318 pid=2393 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:32:57.327000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Feb 9 20:32:57.327000 audit[2394]: NETFILTER_CFG table=nat:57 family=10 entries=1 op=nft_register_chain pid=2394 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 9 20:32:57.327000 audit[2394]: SYSCALL 
arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffccb2dfdc0 a2=0 a3=7ffccb2dfdac items=0 ppid=2318 pid=2394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:32:57.327000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Feb 9 20:32:57.327000 audit[2395]: NETFILTER_CFG table=filter:58 family=10 entries=1 op=nft_register_chain pid=2395 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 9 20:32:57.327000 audit[2395]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffcc2dbede0 a2=0 a3=7ffcc2dbedcc items=0 ppid=2318 pid=2395 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:32:57.327000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Feb 9 20:32:57.425418 kubelet[2318]: E0209 20:32:57.425187 2318 controller.go:146] failed to ensure lease exists, will retry in 800ms, error: Get "https://86.109.11.101:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-3510.3.2-a-45f40c263c?timeout=10s": dial tcp 86.109.11.101:6443: connect: connection refused Feb 9 20:32:57.428565 kubelet[2318]: I0209 20:32:57.428484 2318 topology_manager.go:210] "Topology Admit Handler" Feb 9 20:32:57.431510 kubelet[2318]: I0209 20:32:57.431501 2318 topology_manager.go:210] "Topology Admit Handler" Feb 9 20:32:57.432224 kubelet[2318]: I0209 20:32:57.432215 2318 topology_manager.go:210] "Topology Admit Handler" Feb 9 20:32:57.432440 kubelet[2318]: I0209 20:32:57.432430 2318 status_manager.go:698] "Failed to get status for pod" podUID=c5f4a9c7b848be4572dae020c8b56f1e 
pod="kube-system/kube-apiserver-ci-3510.3.2-a-45f40c263c" err="Get \"https://86.109.11.101:6443/api/v1/namespaces/kube-system/pods/kube-apiserver-ci-3510.3.2-a-45f40c263c\": dial tcp 86.109.11.101:6443: connect: connection refused" Feb 9 20:32:57.433077 kubelet[2318]: I0209 20:32:57.433067 2318 status_manager.go:698] "Failed to get status for pod" podUID=114bcd74573bd5e3cfc5ef78b30a7c8c pod="kube-system/kube-controller-manager-ci-3510.3.2-a-45f40c263c" err="Get \"https://86.109.11.101:6443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ci-3510.3.2-a-45f40c263c\": dial tcp 86.109.11.101:6443: connect: connection refused" Feb 9 20:32:57.433726 kubelet[2318]: I0209 20:32:57.433718 2318 status_manager.go:698] "Failed to get status for pod" podUID=768a6c95fed17261400746ce380c8959 pod="kube-system/kube-scheduler-ci-3510.3.2-a-45f40c263c" err="Get \"https://86.109.11.101:6443/api/v1/namespaces/kube-system/pods/kube-scheduler-ci-3510.3.2-a-45f40c263c\": dial tcp 86.109.11.101:6443: connect: connection refused" Feb 9 20:32:57.526854 kubelet[2318]: I0209 20:32:57.526768 2318 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/768a6c95fed17261400746ce380c8959-kubeconfig\") pod \"kube-scheduler-ci-3510.3.2-a-45f40c263c\" (UID: \"768a6c95fed17261400746ce380c8959\") " pod="kube-system/kube-scheduler-ci-3510.3.2-a-45f40c263c" Feb 9 20:32:57.527147 kubelet[2318]: I0209 20:32:57.526933 2318 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/c5f4a9c7b848be4572dae020c8b56f1e-ca-certs\") pod \"kube-apiserver-ci-3510.3.2-a-45f40c263c\" (UID: \"c5f4a9c7b848be4572dae020c8b56f1e\") " pod="kube-system/kube-apiserver-ci-3510.3.2-a-45f40c263c" Feb 9 20:32:57.527147 kubelet[2318]: I0209 20:32:57.527019 2318 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/114bcd74573bd5e3cfc5ef78b30a7c8c-k8s-certs\") pod \"kube-controller-manager-ci-3510.3.2-a-45f40c263c\" (UID: \"114bcd74573bd5e3cfc5ef78b30a7c8c\") " pod="kube-system/kube-controller-manager-ci-3510.3.2-a-45f40c263c" Feb 9 20:32:57.527507 kubelet[2318]: I0209 20:32:57.527222 2318 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/114bcd74573bd5e3cfc5ef78b30a7c8c-kubeconfig\") pod \"kube-controller-manager-ci-3510.3.2-a-45f40c263c\" (UID: \"114bcd74573bd5e3cfc5ef78b30a7c8c\") " pod="kube-system/kube-controller-manager-ci-3510.3.2-a-45f40c263c" Feb 9 20:32:57.527507 kubelet[2318]: I0209 20:32:57.527402 2318 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/114bcd74573bd5e3cfc5ef78b30a7c8c-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-3510.3.2-a-45f40c263c\" (UID: \"114bcd74573bd5e3cfc5ef78b30a7c8c\") " pod="kube-system/kube-controller-manager-ci-3510.3.2-a-45f40c263c" Feb 9 20:32:57.527857 kubelet[2318]: I0209 20:32:57.527527 2318 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/c5f4a9c7b848be4572dae020c8b56f1e-k8s-certs\") pod \"kube-apiserver-ci-3510.3.2-a-45f40c263c\" (UID: \"c5f4a9c7b848be4572dae020c8b56f1e\") " pod="kube-system/kube-apiserver-ci-3510.3.2-a-45f40c263c" Feb 9 20:32:57.527857 kubelet[2318]: I0209 20:32:57.527648 2318 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/c5f4a9c7b848be4572dae020c8b56f1e-usr-share-ca-certificates\") pod \"kube-apiserver-ci-3510.3.2-a-45f40c263c\" (UID: \"c5f4a9c7b848be4572dae020c8b56f1e\") " 
pod="kube-system/kube-apiserver-ci-3510.3.2-a-45f40c263c" Feb 9 20:32:57.527857 kubelet[2318]: I0209 20:32:57.527754 2318 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/114bcd74573bd5e3cfc5ef78b30a7c8c-ca-certs\") pod \"kube-controller-manager-ci-3510.3.2-a-45f40c263c\" (UID: \"114bcd74573bd5e3cfc5ef78b30a7c8c\") " pod="kube-system/kube-controller-manager-ci-3510.3.2-a-45f40c263c" Feb 9 20:32:57.528179 kubelet[2318]: I0209 20:32:57.527871 2318 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/114bcd74573bd5e3cfc5ef78b30a7c8c-flexvolume-dir\") pod \"kube-controller-manager-ci-3510.3.2-a-45f40c263c\" (UID: \"114bcd74573bd5e3cfc5ef78b30a7c8c\") " pod="kube-system/kube-controller-manager-ci-3510.3.2-a-45f40c263c" Feb 9 20:32:57.530166 kubelet[2318]: I0209 20:32:57.530126 2318 kubelet_node_status.go:70] "Attempting to register node" node="ci-3510.3.2-a-45f40c263c" Feb 9 20:32:57.530858 kubelet[2318]: E0209 20:32:57.530781 2318 kubelet_node_status.go:92] "Unable to register node with API server" err="Post \"https://86.109.11.101:6443/api/v1/nodes\": dial tcp 86.109.11.101:6443: connect: connection refused" node="ci-3510.3.2-a-45f40c263c" Feb 9 20:32:57.735099 env[1563]: time="2024-02-09T20:32:57.734874968Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-3510.3.2-a-45f40c263c,Uid:c5f4a9c7b848be4572dae020c8b56f1e,Namespace:kube-system,Attempt:0,}" Feb 9 20:32:57.735099 env[1563]: time="2024-02-09T20:32:57.735054911Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-3510.3.2-a-45f40c263c,Uid:114bcd74573bd5e3cfc5ef78b30a7c8c,Namespace:kube-system,Attempt:0,}" Feb 9 20:32:57.736815 env[1563]: time="2024-02-09T20:32:57.736743230Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-scheduler-ci-3510.3.2-a-45f40c263c,Uid:768a6c95fed17261400746ce380c8959,Namespace:kube-system,Attempt:0,}" Feb 9 20:32:57.760193 kubelet[2318]: W0209 20:32:57.760021 2318 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.CSIDriver: Get "https://86.109.11.101:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 86.109.11.101:6443: connect: connection refused Feb 9 20:32:57.760193 kubelet[2318]: E0209 20:32:57.760166 2318 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://86.109.11.101:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 86.109.11.101:6443: connect: connection refused Feb 9 20:32:58.100295 kubelet[2318]: W0209 20:32:58.100147 2318 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.Service: Get "https://86.109.11.101:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 86.109.11.101:6443: connect: connection refused Feb 9 20:32:58.100295 kubelet[2318]: E0209 20:32:58.100268 2318 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://86.109.11.101:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 86.109.11.101:6443: connect: connection refused Feb 9 20:32:58.140882 kubelet[2318]: W0209 20:32:58.140763 2318 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.RuntimeClass: Get "https://86.109.11.101:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 86.109.11.101:6443: connect: connection refused Feb 9 20:32:58.140882 kubelet[2318]: E0209 20:32:58.140843 2318 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get 
"https://86.109.11.101:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 86.109.11.101:6443: connect: connection refused Feb 9 20:32:58.204717 kubelet[2318]: W0209 20:32:58.204568 2318 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.Node: Get "https://86.109.11.101:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-3510.3.2-a-45f40c263c&limit=500&resourceVersion=0": dial tcp 86.109.11.101:6443: connect: connection refused Feb 9 20:32:58.204717 kubelet[2318]: E0209 20:32:58.204690 2318 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://86.109.11.101:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-3510.3.2-a-45f40c263c&limit=500&resourceVersion=0": dial tcp 86.109.11.101:6443: connect: connection refused Feb 9 20:32:58.227141 kubelet[2318]: E0209 20:32:58.227021 2318 controller.go:146] failed to ensure lease exists, will retry in 1.6s, error: Get "https://86.109.11.101:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-3510.3.2-a-45f40c263c?timeout=10s": dial tcp 86.109.11.101:6443: connect: connection refused Feb 9 20:32:58.277508 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3818710387.mount: Deactivated successfully. 
Feb 9 20:32:58.278975 env[1563]: time="2024-02-09T20:32:58.278912745Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/pause:3.6,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 9 20:32:58.280069 env[1563]: time="2024-02-09T20:32:58.280027122Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/pause:3.6,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 9 20:32:58.280677 env[1563]: time="2024-02-09T20:32:58.280638316Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:6270bb605e12e581514ada5fd5b3216f727db55dc87d5889c790e4c760683fee,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 9 20:32:58.281505 env[1563]: time="2024-02-09T20:32:58.281446452Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/pause:3.6,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 9 20:32:58.281855 env[1563]: time="2024-02-09T20:32:58.281822325Z" level=info msg="ImageUpdate event &ImageUpdate{Name:sha256:6270bb605e12e581514ada5fd5b3216f727db55dc87d5889c790e4c760683fee,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 9 20:32:58.283152 env[1563]: time="2024-02-09T20:32:58.283108712Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/pause:3.6,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 9 20:32:58.284837 env[1563]: time="2024-02-09T20:32:58.284796595Z" level=info msg="ImageUpdate event &ImageUpdate{Name:sha256:6270bb605e12e581514ada5fd5b3216f727db55dc87d5889c790e4c760683fee,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 9 20:32:58.285209 env[1563]: time="2024-02-09T20:32:58.285170075Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/pause:3.6,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 9 
20:32:58.286313 env[1563]: time="2024-02-09T20:32:58.286269125Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/pause@sha256:3d380ca8864549e74af4b29c10f9cb0956236dfb01c40ca076fb6c37253234db,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 9 20:32:58.286789 env[1563]: time="2024-02-09T20:32:58.286750593Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/pause:3.6,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 9 20:32:58.287692 env[1563]: time="2024-02-09T20:32:58.287643610Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/pause@sha256:3d380ca8864549e74af4b29c10f9cb0956236dfb01c40ca076fb6c37253234db,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 9 20:32:58.288140 env[1563]: time="2024-02-09T20:32:58.288106890Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/pause@sha256:3d380ca8864549e74af4b29c10f9cb0956236dfb01c40ca076fb6c37253234db,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 9 20:32:58.293125 env[1563]: time="2024-02-09T20:32:58.293090808Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 9 20:32:58.293125 env[1563]: time="2024-02-09T20:32:58.293111582Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 9 20:32:58.293125 env[1563]: time="2024-02-09T20:32:58.293118359Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 9 20:32:58.293268 env[1563]: time="2024-02-09T20:32:58.293186579Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/ef06fbef21fe091661efeba7e241982cc10e15daf7f24d88b75e4f5099aab8a1 pid=2403 runtime=io.containerd.runc.v2 Feb 9 20:32:58.294439 env[1563]: time="2024-02-09T20:32:58.294393935Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 9 20:32:58.294439 env[1563]: time="2024-02-09T20:32:58.294420523Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 9 20:32:58.294439 env[1563]: time="2024-02-09T20:32:58.294431407Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 9 20:32:58.294565 env[1563]: time="2024-02-09T20:32:58.294512166Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/53e3dc95a240e4a79044e87b1c938fff0c2e87c743eccddf3d447f8e5d0b6a13 pid=2419 runtime=io.containerd.runc.v2 Feb 9 20:32:58.296492 env[1563]: time="2024-02-09T20:32:58.296461876Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 9 20:32:58.296492 env[1563]: time="2024-02-09T20:32:58.296481382Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 9 20:32:58.296492 env[1563]: time="2024-02-09T20:32:58.296488002Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 9 20:32:58.296602 env[1563]: time="2024-02-09T20:32:58.296552523Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/8b1d308c6aa1d4382e5d64f13cea2c286b5e8403dad72acc2991b5ae12e86c2d pid=2440 runtime=io.containerd.runc.v2 Feb 9 20:32:58.331865 kubelet[2318]: I0209 20:32:58.331823 2318 kubelet_node_status.go:70] "Attempting to register node" node="ci-3510.3.2-a-45f40c263c" Feb 9 20:32:58.332071 kubelet[2318]: E0209 20:32:58.332040 2318 kubelet_node_status.go:92] "Unable to register node with API server" err="Post \"https://86.109.11.101:6443/api/v1/nodes\": dial tcp 86.109.11.101:6443: connect: connection refused" node="ci-3510.3.2-a-45f40c263c" Feb 9 20:32:58.335052 env[1563]: time="2024-02-09T20:32:58.335026338Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-3510.3.2-a-45f40c263c,Uid:c5f4a9c7b848be4572dae020c8b56f1e,Namespace:kube-system,Attempt:0,} returns sandbox id \"ef06fbef21fe091661efeba7e241982cc10e15daf7f24d88b75e4f5099aab8a1\"" Feb 9 20:32:58.336511 env[1563]: time="2024-02-09T20:32:58.336461125Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-3510.3.2-a-45f40c263c,Uid:768a6c95fed17261400746ce380c8959,Namespace:kube-system,Attempt:0,} returns sandbox id \"8b1d308c6aa1d4382e5d64f13cea2c286b5e8403dad72acc2991b5ae12e86c2d\"" Feb 9 20:32:58.336876 env[1563]: time="2024-02-09T20:32:58.336836034Z" level=info msg="CreateContainer within sandbox \"ef06fbef21fe091661efeba7e241982cc10e15daf7f24d88b75e4f5099aab8a1\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Feb 9 20:32:58.338448 env[1563]: time="2024-02-09T20:32:58.338432391Z" level=info msg="CreateContainer within sandbox \"8b1d308c6aa1d4382e5d64f13cea2c286b5e8403dad72acc2991b5ae12e86c2d\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Feb 9 20:32:58.342492 env[1563]: 
time="2024-02-09T20:32:58.342460759Z" level=info msg="CreateContainer within sandbox \"ef06fbef21fe091661efeba7e241982cc10e15daf7f24d88b75e4f5099aab8a1\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"f3a5cf604a01a21ce947ffc77850e8152521f5b2975d460b05157532b33ce795\"" Feb 9 20:32:58.342726 env[1563]: time="2024-02-09T20:32:58.342701165Z" level=info msg="StartContainer for \"f3a5cf604a01a21ce947ffc77850e8152521f5b2975d460b05157532b33ce795\"" Feb 9 20:32:58.347855 env[1563]: time="2024-02-09T20:32:58.347824303Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-3510.3.2-a-45f40c263c,Uid:114bcd74573bd5e3cfc5ef78b30a7c8c,Namespace:kube-system,Attempt:0,} returns sandbox id \"53e3dc95a240e4a79044e87b1c938fff0c2e87c743eccddf3d447f8e5d0b6a13\"" Feb 9 20:32:58.349007 env[1563]: time="2024-02-09T20:32:58.348965682Z" level=info msg="CreateContainer within sandbox \"53e3dc95a240e4a79044e87b1c938fff0c2e87c743eccddf3d447f8e5d0b6a13\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Feb 9 20:32:58.352798 env[1563]: time="2024-02-09T20:32:58.352735638Z" level=info msg="CreateContainer within sandbox \"53e3dc95a240e4a79044e87b1c938fff0c2e87c743eccddf3d447f8e5d0b6a13\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"ed25bd3bd611caafdcaa191ed11c37e210f283c41b5315e870eef27a153b0dfe\"" Feb 9 20:32:58.352964 env[1563]: time="2024-02-09T20:32:58.352911138Z" level=info msg="StartContainer for \"ed25bd3bd611caafdcaa191ed11c37e210f283c41b5315e870eef27a153b0dfe\"" Feb 9 20:32:58.386630 env[1563]: time="2024-02-09T20:32:58.386577045Z" level=info msg="StartContainer for \"f3a5cf604a01a21ce947ffc77850e8152521f5b2975d460b05157532b33ce795\" returns successfully" Feb 9 20:32:58.395366 env[1563]: time="2024-02-09T20:32:58.395338500Z" level=info msg="CreateContainer within sandbox \"8b1d308c6aa1d4382e5d64f13cea2c286b5e8403dad72acc2991b5ae12e86c2d\" for 
&ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"f34f99da697e273d43a09ac75e7361068bb1857b301935a24004983c082395f1\"" Feb 9 20:32:58.395673 env[1563]: time="2024-02-09T20:32:58.395653104Z" level=info msg="StartContainer for \"f34f99da697e273d43a09ac75e7361068bb1857b301935a24004983c082395f1\"" Feb 9 20:32:58.398665 env[1563]: time="2024-02-09T20:32:58.398636767Z" level=info msg="StartContainer for \"ed25bd3bd611caafdcaa191ed11c37e210f283c41b5315e870eef27a153b0dfe\" returns successfully" Feb 9 20:32:58.441445 env[1563]: time="2024-02-09T20:32:58.441388372Z" level=info msg="StartContainer for \"f34f99da697e273d43a09ac75e7361068bb1857b301935a24004983c082395f1\" returns successfully" Feb 9 20:32:59.835246 kubelet[2318]: E0209 20:32:59.835152 2318 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-3510.3.2-a-45f40c263c\" not found" node="ci-3510.3.2-a-45f40c263c" Feb 9 20:32:59.936195 kubelet[2318]: I0209 20:32:59.936104 2318 kubelet_node_status.go:70] "Attempting to register node" node="ci-3510.3.2-a-45f40c263c" Feb 9 20:32:59.947578 update_engine[1551]: I0209 20:32:59.947484 1551 update_attempter.cc:509] Updating boot flags... 
Feb 9 20:33:00.229950 kubelet[2318]: I0209 20:33:00.229747 2318 kubelet_node_status.go:73] "Successfully registered node" node="ci-3510.3.2-a-45f40c263c" Feb 9 20:33:00.246253 kubelet[2318]: E0209 20:33:00.246229 2318 kubelet_node_status.go:458] "Error getting the current node from lister" err="node \"ci-3510.3.2-a-45f40c263c\" not found" Feb 9 20:33:00.347105 kubelet[2318]: E0209 20:33:00.347017 2318 kubelet_node_status.go:458] "Error getting the current node from lister" err="node \"ci-3510.3.2-a-45f40c263c\" not found" Feb 9 20:33:00.824709 kubelet[2318]: I0209 20:33:00.824616 2318 apiserver.go:52] "Watching apiserver" Feb 9 20:33:01.524605 kubelet[2318]: I0209 20:33:01.524548 2318 desired_state_of_world_populator.go:159] "Finished populating initial desired state of world" Feb 9 20:33:01.557856 kubelet[2318]: I0209 20:33:01.557761 2318 reconciler.go:41] "Reconciler: start to sync state" Feb 9 20:33:02.493326 systemd[1]: Reloading. Feb 9 20:33:02.546885 /usr/lib/systemd/system-generators/torcx-generator[2701]: time="2024-02-09T20:33:02Z" level=debug msg="common configuration parsed" base_dir=/var/lib/torcx/ conf_dir=/etc/torcx/ run_dir=/run/torcx/ store_paths="[/usr/share/torcx/store /usr/share/oem/torcx/store/3510.3.2 /usr/share/oem/torcx/store /var/lib/torcx/store/3510.3.2 /var/lib/torcx/store]" Feb 9 20:33:02.546902 /usr/lib/systemd/system-generators/torcx-generator[2701]: time="2024-02-09T20:33:02Z" level=info msg="torcx already run" Feb 9 20:33:02.601699 systemd[1]: /usr/lib/systemd/system/locksmithd.service:8: Unit uses CPUShares=; please use CPUWeight= instead. Support for CPUShares= will be removed soon. Feb 9 20:33:02.601707 systemd[1]: /usr/lib/systemd/system/locksmithd.service:9: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Feb 9 20:33:02.612507 systemd[1]: /run/systemd/system/docker.socket:8: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Feb 9 20:33:02.670572 kubelet[2318]: I0209 20:33:02.670557 2318 dynamic_cafile_content.go:171] "Shutting down controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Feb 9 20:33:02.670573 systemd[1]: Stopping kubelet.service... Feb 9 20:33:02.688696 systemd[1]: kubelet.service: Deactivated successfully. Feb 9 20:33:02.688854 systemd[1]: Stopped kubelet.service. Feb 9 20:33:02.687000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:33:02.689802 systemd[1]: Started kubelet.service. Feb 9 20:33:02.715335 kubelet[2767]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.27. Image garbage collector will get sandbox image information from CRI. Feb 9 20:33:02.715335 kubelet[2767]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 9 20:33:02.715742 kubelet[2767]: I0209 20:33:02.715370 2767 server.go:198] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Feb 9 20:33:02.716091 kubelet[2767]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.27. Image garbage collector will get sandbox image information from CRI. Feb 9 20:33:02.716091 kubelet[2767]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. 
See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 9 20:33:02.716826 kernel: kauditd_printk_skb: 104 callbacks suppressed Feb 9 20:33:02.716860 kernel: audit: type=1131 audit(1707510782.687:220): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:33:02.718306 kubelet[2767]: I0209 20:33:02.718267 2767 server.go:412] "Kubelet version" kubeletVersion="v1.26.5" Feb 9 20:33:02.718306 kubelet[2767]: I0209 20:33:02.718281 2767 server.go:414] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Feb 9 20:33:02.718486 kubelet[2767]: I0209 20:33:02.718451 2767 server.go:836] "Client rotation is on, will bootstrap in background" Feb 9 20:33:02.719305 kubelet[2767]: I0209 20:33:02.719263 2767 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Feb 9 20:33:02.719742 kubelet[2767]: I0209 20:33:02.719702 2767 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Feb 9 20:33:02.688000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:33:02.847319 kernel: audit: type=1130 audit(1707510782.688:221): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:33:02.860090 kubelet[2767]: I0209 20:33:02.860041 2767 server.go:659] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Feb 9 20:33:02.860419 kubelet[2767]: I0209 20:33:02.860390 2767 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Feb 9 20:33:02.860463 kubelet[2767]: I0209 20:33:02.860454 2767 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={RuntimeCgroupsName: SystemCgroupsName: KubeletCgroupsName: KubeletOOMScoreAdj:-999 ContainerRuntime: CgroupsPerQOS:true CgroupRoot:/ CgroupDriver:cgroupfs KubeletRootDir:/var/lib/kubelet ProtectKernelDefaults:false NodeAllocatableConfig:{KubeReservedCgroupName: SystemReservedCgroupName: ReservedSystemCPUs: EnforceNodeAllocatable:map[pods:{}] KubeReserved:map[] SystemReserved:map[] HardEvictionThresholds:[{Signal:memory.available Operator:LessThan Value:{Quantity:100Mi Percentage:0} GracePeriod:0s MinReclaim:} {Signal:nodefs.available Operator:LessThan Value:{Quantity: Percentage:0.1} GracePeriod:0s MinReclaim:} {Signal:nodefs.inodesFree Operator:LessThan Value:{Quantity: Percentage:0.05} GracePeriod:0s MinReclaim:} {Signal:imagefs.available Operator:LessThan Value:{Quantity: Percentage:0.15} GracePeriod:0s MinReclaim:}]} QOSReserved:map[] CPUManagerPolicy:none CPUManagerPolicyOptions:map[] ExperimentalTopologyManagerScope:container CPUManagerReconcilePeriod:10s ExperimentalMemoryManagerPolicy:None ExperimentalMemoryManagerReservedMemory:[] ExperimentalPodPidsLimit:-1 EnforceCPULimits:true CPUCFSQuotaPeriod:100ms ExperimentalTopologyManagerPolicy:none ExperimentalTopologyManagerPolicyOptions:map[]} Feb 9 20:33:02.860524 kubelet[2767]: I0209 20:33:02.860470 2767 topology_manager.go:134] "Creating topology manager with policy per scope" topologyPolicyName="none" topologyScopeName="container" Feb 9 20:33:02.860524 kubelet[2767]: I0209 20:33:02.860480 2767 container_manager_linux.go:308] "Creating device plugin manager" Feb 9 20:33:02.860524 kubelet[2767]: I0209 20:33:02.860508 2767 state_mem.go:36] "Initialized new 
in-memory state store" Feb 9 20:33:02.862402 kubelet[2767]: I0209 20:33:02.862396 2767 kubelet.go:398] "Attempting to sync node with API server" Feb 9 20:33:02.862435 kubelet[2767]: I0209 20:33:02.862407 2767 kubelet.go:286] "Adding static pod path" path="/etc/kubernetes/manifests" Feb 9 20:33:02.862435 kubelet[2767]: I0209 20:33:02.862422 2767 kubelet.go:297] "Adding apiserver pod source" Feb 9 20:33:02.862435 kubelet[2767]: I0209 20:33:02.862430 2767 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Feb 9 20:33:02.862688 kubelet[2767]: I0209 20:33:02.862678 2767 kuberuntime_manager.go:244] "Container runtime initialized" containerRuntime="containerd" version="1.6.16" apiVersion="v1" Feb 9 20:33:02.863010 kubelet[2767]: I0209 20:33:02.863000 2767 server.go:1186] "Started kubelet" Feb 9 20:33:02.863294 kubelet[2767]: E0209 20:33:02.863281 2767 cri_stats_provider.go:455] "Failed to get the info of the filesystem with mountpoint" err="unable to find data in memory cache" mountpoint="/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs" Feb 9 20:33:02.863356 kubelet[2767]: E0209 20:33:02.863298 2767 kubelet.go:1386] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Feb 9 20:33:02.863467 kubelet[2767]: I0209 20:33:02.863456 2767 server.go:161] "Starting to listen" address="0.0.0.0" port=10250 Feb 9 20:33:02.865049 kubelet[2767]: I0209 20:33:02.865037 2767 server.go:451] "Adding debug handlers to kubelet server" Feb 9 20:33:02.863000 audit[2767]: AVC avc: denied { mac_admin } for pid=2767 comm="kubelet" capability=33 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 9 20:33:02.865169 kubelet[2767]: I0209 20:33:02.865121 2767 kubelet.go:1341] "Unprivileged containerized plugins might not work, could not set selinux context on plugin registration dir" path="/var/lib/kubelet/plugins_registry" err="setxattr /var/lib/kubelet/plugins_registry: invalid argument" Feb 9 20:33:02.865192 kubelet[2767]: I0209 20:33:02.865168 2767 kubelet.go:1345] "Unprivileged containerized plugins might not work, could not set selinux context on plugins dir" path="/var/lib/kubelet/plugins" err="setxattr /var/lib/kubelet/plugins: invalid argument" Feb 9 20:33:02.865211 kubelet[2767]: I0209 20:33:02.865197 2767 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Feb 9 20:33:02.865756 kubelet[2767]: I0209 20:33:02.865744 2767 volume_manager.go:293] "Starting Kubelet Volume Manager" Feb 9 20:33:02.865809 kubelet[2767]: I0209 20:33:02.865786 2767 desired_state_of_world_populator.go:151] "Desired state populator starts to run" Feb 9 20:33:02.876044 kubelet[2767]: I0209 20:33:02.876001 2767 kubelet_network_linux.go:63] "Initialized iptables rules." protocol=IPv4 Feb 9 20:33:02.883144 kubelet[2767]: I0209 20:33:02.883124 2767 kubelet_network_linux.go:63] "Initialized iptables rules." 
protocol=IPv6 Feb 9 20:33:02.883144 kubelet[2767]: I0209 20:33:02.883144 2767 status_manager.go:176] "Starting to sync pod status with apiserver" Feb 9 20:33:02.883270 kubelet[2767]: I0209 20:33:02.883165 2767 kubelet.go:2113] "Starting kubelet main sync loop" Feb 9 20:33:02.883270 kubelet[2767]: E0209 20:33:02.883222 2767 kubelet.go:2137] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Feb 9 20:33:02.903018 kubelet[2767]: I0209 20:33:02.903000 2767 cpu_manager.go:214] "Starting CPU manager" policy="none" Feb 9 20:33:02.903018 kubelet[2767]: I0209 20:33:02.903014 2767 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Feb 9 20:33:02.903018 kubelet[2767]: I0209 20:33:02.903023 2767 state_mem.go:36] "Initialized new in-memory state store" Feb 9 20:33:02.903147 kubelet[2767]: I0209 20:33:02.903125 2767 state_mem.go:88] "Updated default CPUSet" cpuSet="" Feb 9 20:33:02.903147 kubelet[2767]: I0209 20:33:02.903133 2767 state_mem.go:96] "Updated CPUSet assignments" assignments=map[] Feb 9 20:33:02.903147 kubelet[2767]: I0209 20:33:02.903137 2767 policy_none.go:49] "None policy: Start" Feb 9 20:33:02.903389 kubelet[2767]: I0209 20:33:02.903382 2767 memory_manager.go:169] "Starting memorymanager" policy="None" Feb 9 20:33:02.903415 kubelet[2767]: I0209 20:33:02.903392 2767 state_mem.go:35] "Initializing new in-memory state store" Feb 9 20:33:02.903469 kubelet[2767]: I0209 20:33:02.903464 2767 state_mem.go:75] "Updated machine memory state" Feb 9 20:33:02.904091 kubelet[2767]: I0209 20:33:02.904085 2767 manager.go:455] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Feb 9 20:33:02.904129 kubelet[2767]: I0209 20:33:02.904122 2767 server.go:88] "Unprivileged containerized plugins might not work. 
Could not set selinux context on socket dir" path="/var/lib/kubelet/device-plugins/" err="setxattr /var/lib/kubelet/device-plugins/: invalid argument" Feb 9 20:33:02.904290 kubelet[2767]: I0209 20:33:02.904285 2767 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Feb 9 20:33:02.863000 audit: SELINUX_ERR op=setxattr invalid_context="system_u:object_r:container_file_t:s0" Feb 9 20:33:02.962556 kernel: audit: type=1400 audit(1707510782.863:222): avc: denied { mac_admin } for pid=2767 comm="kubelet" capability=33 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 9 20:33:02.962598 kernel: audit: type=1401 audit(1707510782.863:222): op=setxattr invalid_context="system_u:object_r:container_file_t:s0" Feb 9 20:33:02.962611 kernel: audit: type=1300 audit(1707510782.863:222): arch=c000003e syscall=188 success=no exit=-22 a0=c000be0030 a1=c000bdc018 a2=c000be0000 a3=25 items=0 ppid=1 pid=2767 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kubelet" exe="/opt/bin/kubelet" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:33:02.863000 audit[2767]: SYSCALL arch=c000003e syscall=188 success=no exit=-22 a0=c000be0030 a1=c000bdc018 a2=c000be0000 a3=25 items=0 ppid=1 pid=2767 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kubelet" exe="/opt/bin/kubelet" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:33:02.968413 kubelet[2767]: I0209 20:33:02.968403 2767 kubelet_node_status.go:70] "Attempting to register node" node="ci-3510.3.2-a-45f40c263c" Feb 9 20:33:02.973077 kubelet[2767]: I0209 20:33:02.973067 2767 kubelet_node_status.go:108] "Node was previously registered" node="ci-3510.3.2-a-45f40c263c" Feb 9 20:33:02.973124 kubelet[2767]: I0209 20:33:02.973108 2767 kubelet_node_status.go:73] "Successfully registered node" node="ci-3510.3.2-a-45f40c263c" Feb 9 20:33:02.983643 kubelet[2767]: I0209 
20:33:02.983605 2767 topology_manager.go:210] "Topology Admit Handler" Feb 9 20:33:02.983643 kubelet[2767]: I0209 20:33:02.983644 2767 topology_manager.go:210] "Topology Admit Handler" Feb 9 20:33:02.983711 kubelet[2767]: I0209 20:33:02.983659 2767 topology_manager.go:210] "Topology Admit Handler" Feb 9 20:33:02.986965 kubelet[2767]: E0209 20:33:02.986955 2767 kubelet.go:1802] "Failed creating a mirror pod for" err="pods \"kube-apiserver-ci-3510.3.2-a-45f40c263c\" already exists" pod="kube-system/kube-apiserver-ci-3510.3.2-a-45f40c263c" Feb 9 20:33:03.058411 kernel: audit: type=1327 audit(1707510782.863:222): proctitle=2F6F70742F62696E2F6B7562656C6574002D2D626F6F7473747261702D6B756265636F6E6669673D2F6574632F6B756265726E657465732F626F6F7473747261702D6B7562656C65742E636F6E66002D2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F6B7562656C65742E636F6E66002D2D636F6E6669 Feb 9 20:33:02.863000 audit: PROCTITLE proctitle=2F6F70742F62696E2F6B7562656C6574002D2D626F6F7473747261702D6B756265636F6E6669673D2F6574632F6B756265726E657465732F626F6F7473747261702D6B7562656C65742E636F6E66002D2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F6B7562656C65742E636F6E66002D2D636F6E6669 Feb 9 20:33:03.066937 kubelet[2767]: I0209 20:33:03.066924 2767 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/c5f4a9c7b848be4572dae020c8b56f1e-usr-share-ca-certificates\") pod \"kube-apiserver-ci-3510.3.2-a-45f40c263c\" (UID: \"c5f4a9c7b848be4572dae020c8b56f1e\") " pod="kube-system/kube-apiserver-ci-3510.3.2-a-45f40c263c" Feb 9 20:33:03.067021 kubelet[2767]: I0209 20:33:03.066945 2767 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/114bcd74573bd5e3cfc5ef78b30a7c8c-k8s-certs\") pod \"kube-controller-manager-ci-3510.3.2-a-45f40c263c\" (UID: \"114bcd74573bd5e3cfc5ef78b30a7c8c\") " 
pod="kube-system/kube-controller-manager-ci-3510.3.2-a-45f40c263c" Feb 9 20:33:03.067021 kubelet[2767]: I0209 20:33:03.066960 2767 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/114bcd74573bd5e3cfc5ef78b30a7c8c-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-3510.3.2-a-45f40c263c\" (UID: \"114bcd74573bd5e3cfc5ef78b30a7c8c\") " pod="kube-system/kube-controller-manager-ci-3510.3.2-a-45f40c263c" Feb 9 20:33:03.067021 kubelet[2767]: I0209 20:33:03.066973 2767 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/768a6c95fed17261400746ce380c8959-kubeconfig\") pod \"kube-scheduler-ci-3510.3.2-a-45f40c263c\" (UID: \"768a6c95fed17261400746ce380c8959\") " pod="kube-system/kube-scheduler-ci-3510.3.2-a-45f40c263c" Feb 9 20:33:03.067021 kubelet[2767]: I0209 20:33:03.066985 2767 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/c5f4a9c7b848be4572dae020c8b56f1e-ca-certs\") pod \"kube-apiserver-ci-3510.3.2-a-45f40c263c\" (UID: \"c5f4a9c7b848be4572dae020c8b56f1e\") " pod="kube-system/kube-apiserver-ci-3510.3.2-a-45f40c263c" Feb 9 20:33:03.067021 kubelet[2767]: I0209 20:33:03.067018 2767 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/c5f4a9c7b848be4572dae020c8b56f1e-k8s-certs\") pod \"kube-apiserver-ci-3510.3.2-a-45f40c263c\" (UID: \"c5f4a9c7b848be4572dae020c8b56f1e\") " pod="kube-system/kube-apiserver-ci-3510.3.2-a-45f40c263c" Feb 9 20:33:03.067162 kubelet[2767]: I0209 20:33:03.067046 2767 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: 
\"kubernetes.io/host-path/114bcd74573bd5e3cfc5ef78b30a7c8c-ca-certs\") pod \"kube-controller-manager-ci-3510.3.2-a-45f40c263c\" (UID: \"114bcd74573bd5e3cfc5ef78b30a7c8c\") " pod="kube-system/kube-controller-manager-ci-3510.3.2-a-45f40c263c" Feb 9 20:33:03.067162 kubelet[2767]: I0209 20:33:03.067069 2767 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/114bcd74573bd5e3cfc5ef78b30a7c8c-flexvolume-dir\") pod \"kube-controller-manager-ci-3510.3.2-a-45f40c263c\" (UID: \"114bcd74573bd5e3cfc5ef78b30a7c8c\") " pod="kube-system/kube-controller-manager-ci-3510.3.2-a-45f40c263c" Feb 9 20:33:03.067162 kubelet[2767]: I0209 20:33:03.067088 2767 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/114bcd74573bd5e3cfc5ef78b30a7c8c-kubeconfig\") pod \"kube-controller-manager-ci-3510.3.2-a-45f40c263c\" (UID: \"114bcd74573bd5e3cfc5ef78b30a7c8c\") " pod="kube-system/kube-controller-manager-ci-3510.3.2-a-45f40c263c" Feb 9 20:33:02.863000 audit[2767]: AVC avc: denied { mac_admin } for pid=2767 comm="kubelet" capability=33 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 9 20:33:03.215132 kernel: audit: type=1400 audit(1707510782.863:223): avc: denied { mac_admin } for pid=2767 comm="kubelet" capability=33 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 9 20:33:03.215180 kernel: audit: type=1401 audit(1707510782.863:223): op=setxattr invalid_context="system_u:object_r:container_file_t:s0" Feb 9 20:33:02.863000 audit: SELINUX_ERR op=setxattr invalid_context="system_u:object_r:container_file_t:s0" Feb 9 20:33:03.247682 kernel: audit: type=1300 audit(1707510782.863:223): arch=c000003e syscall=188 success=no exit=-22 a0=c000bba0e0 a1=c000bdc030 a2=c000be00c0 a3=25 
items=0 ppid=1 pid=2767 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kubelet" exe="/opt/bin/kubelet" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:33:02.863000 audit[2767]: SYSCALL arch=c000003e syscall=188 success=no exit=-22 a0=c000bba0e0 a1=c000bdc030 a2=c000be00c0 a3=25 items=0 ppid=1 pid=2767 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kubelet" exe="/opt/bin/kubelet" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:33:03.342896 kernel: audit: type=1327 audit(1707510782.863:223): proctitle=2F6F70742F62696E2F6B7562656C6574002D2D626F6F7473747261702D6B756265636F6E6669673D2F6574632F6B756265726E657465732F626F6F7473747261702D6B7562656C65742E636F6E66002D2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F6B7562656C65742E636F6E66002D2D636F6E6669 Feb 9 20:33:02.863000 audit: PROCTITLE proctitle=2F6F70742F62696E2F6B7562656C6574002D2D626F6F7473747261702D6B756265636F6E6669673D2F6574632F6B756265726E657465732F626F6F7473747261702D6B7562656C65742E636F6E66002D2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F6B7562656C65742E636F6E66002D2D636F6E6669 Feb 9 20:33:02.903000 audit[2767]: AVC avc: denied { mac_admin } for pid=2767 comm="kubelet" capability=33 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 9 20:33:02.903000 audit: SELINUX_ERR op=setxattr invalid_context="system_u:object_r:container_file_t:s0" Feb 9 20:33:02.903000 audit[2767]: SYSCALL arch=c000003e syscall=188 success=no exit=-22 a0=c000e5fbf0 a1=c000bdd350 a2=c000e5fbc0 a3=25 items=0 ppid=1 pid=2767 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kubelet" exe="/opt/bin/kubelet" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:33:02.903000 audit: PROCTITLE 
proctitle=2F6F70742F62696E2F6B7562656C6574002D2D626F6F7473747261702D6B756265636F6E6669673D2F6574632F6B756265726E657465732F626F6F7473747261702D6B7562656C65742E636F6E66002D2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F6B7562656C65742E636F6E66002D2D636F6E6669 Feb 9 20:33:03.862847 kubelet[2767]: I0209 20:33:03.862721 2767 apiserver.go:52] "Watching apiserver" Feb 9 20:33:03.966325 kubelet[2767]: I0209 20:33:03.966189 2767 desired_state_of_world_populator.go:159] "Finished populating initial desired state of world" Feb 9 20:33:03.972515 kubelet[2767]: I0209 20:33:03.972429 2767 reconciler.go:41] "Reconciler: start to sync state" Feb 9 20:33:04.271944 kubelet[2767]: E0209 20:33:04.271788 2767 kubelet.go:1802] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-ci-3510.3.2-a-45f40c263c\" already exists" pod="kube-system/kube-controller-manager-ci-3510.3.2-a-45f40c263c" Feb 9 20:33:04.472224 kubelet[2767]: E0209 20:33:04.472125 2767 kubelet.go:1802] "Failed creating a mirror pod for" err="pods \"kube-apiserver-ci-3510.3.2-a-45f40c263c\" already exists" pod="kube-system/kube-apiserver-ci-3510.3.2-a-45f40c263c" Feb 9 20:33:04.670740 kubelet[2767]: E0209 20:33:04.670682 2767 kubelet.go:1802] "Failed creating a mirror pod for" err="pods \"kube-scheduler-ci-3510.3.2-a-45f40c263c\" already exists" pod="kube-system/kube-scheduler-ci-3510.3.2-a-45f40c263c" Feb 9 20:33:04.872200 kubelet[2767]: I0209 20:33:04.872178 2767 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-3510.3.2-a-45f40c263c" podStartSLOduration=1.872135313 pod.CreationTimestamp="2024-02-09 20:33:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-02-09 20:33:04.871971662 +0000 UTC m=+2.180399947" watchObservedRunningTime="2024-02-09 20:33:04.872135313 +0000 UTC m=+2.180563598" Feb 9 20:33:05.267865 kubelet[2767]: I0209 20:33:05.267817 
2767 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-3510.3.2-a-45f40c263c" podStartSLOduration=4.267784049 pod.CreationTimestamp="2024-02-09 20:33:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-02-09 20:33:05.267683336 +0000 UTC m=+2.576111623" watchObservedRunningTime="2024-02-09 20:33:05.267784049 +0000 UTC m=+2.576212333" Feb 9 20:33:05.666776 kubelet[2767]: I0209 20:33:05.666759 2767 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-3510.3.2-a-45f40c263c" podStartSLOduration=2.666737429 pod.CreationTimestamp="2024-02-09 20:33:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-02-09 20:33:05.666625177 +0000 UTC m=+2.975053461" watchObservedRunningTime="2024-02-09 20:33:05.666737429 +0000 UTC m=+2.975165710" Feb 9 20:33:08.034575 sudo[1766]: pam_unix(sudo:session): session closed for user root Feb 9 20:33:08.033000 audit[1766]: USER_END pid=1766 uid=500 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_unix,pam_permit,pam_systemd acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Feb 9 20:33:08.060551 kernel: kauditd_printk_skb: 4 callbacks suppressed Feb 9 20:33:08.060592 kernel: audit: type=1106 audit(1707510788.033:225): pid=1766 uid=500 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_unix,pam_permit,pam_systemd acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Feb 9 20:33:08.060808 sshd[1762]: pam_unix(sshd:session): session closed for user core Feb 9 20:33:08.062887 systemd[1]: sshd@8-86.109.11.101:22-139.178.89.65:49828.service: Deactivated successfully. 
Feb 9 20:33:08.064012 systemd-logind[1548]: Session 11 logged out. Waiting for processes to exit. Feb 9 20:33:08.064067 systemd[1]: session-11.scope: Deactivated successfully. Feb 9 20:33:08.064703 systemd-logind[1548]: Removed session 11. Feb 9 20:33:08.033000 audit[1766]: CRED_DISP pid=1766 uid=500 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Feb 9 20:33:08.232749 kernel: audit: type=1104 audit(1707510788.033:226): pid=1766 uid=500 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Feb 9 20:33:08.232780 kernel: audit: type=1106 audit(1707510788.060:227): pid=1762 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:33:08.060000 audit[1762]: USER_END pid=1762 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:33:08.325918 kernel: audit: type=1104 audit(1707510788.060:228): pid=1762 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:33:08.060000 audit[1762]: CRED_DISP pid=1762 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 
addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:33:08.413132 kernel: audit: type=1131 audit(1707510788.061:229): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-86.109.11.101:22-139.178.89.65:49828 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:33:08.061000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-86.109.11.101:22-139.178.89.65:49828 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:33:14.356424 kubelet[2767]: I0209 20:33:14.356383 2767 kuberuntime_manager.go:1114] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Feb 9 20:33:14.356702 kubelet[2767]: I0209 20:33:14.356673 2767 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Feb 9 20:33:14.356727 env[1563]: time="2024-02-09T20:33:14.356580573Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." 
Feb 9 20:33:15.106767 kubelet[2767]: I0209 20:33:15.106740 2767 topology_manager.go:210] "Topology Admit Handler" Feb 9 20:33:15.155888 kubelet[2767]: I0209 20:33:15.155794 2767 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/e4b4ead4-71c4-4552-bcb4-a2d54e4e4af1-var-lib-calico\") pod \"tigera-operator-cfc98749c-vczrx\" (UID: \"e4b4ead4-71c4-4552-bcb4-a2d54e4e4af1\") " pod="tigera-operator/tigera-operator-cfc98749c-vczrx" Feb 9 20:33:15.156158 kubelet[2767]: I0209 20:33:15.155955 2767 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7vj7\" (UniqueName: \"kubernetes.io/projected/e4b4ead4-71c4-4552-bcb4-a2d54e4e4af1-kube-api-access-s7vj7\") pod \"tigera-operator-cfc98749c-vczrx\" (UID: \"e4b4ead4-71c4-4552-bcb4-a2d54e4e4af1\") " pod="tigera-operator/tigera-operator-cfc98749c-vczrx" Feb 9 20:33:15.314067 kubelet[2767]: I0209 20:33:15.313998 2767 topology_manager.go:210] "Topology Admit Handler" Feb 9 20:33:15.357819 kubelet[2767]: I0209 20:33:15.357696 2767 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/0c878ad5-4ed9-4131-ba8e-60fcd6923264-xtables-lock\") pod \"kube-proxy-lnfz4\" (UID: \"0c878ad5-4ed9-4131-ba8e-60fcd6923264\") " pod="kube-system/kube-proxy-lnfz4" Feb 9 20:33:15.357819 kubelet[2767]: I0209 20:33:15.357754 2767 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0c878ad5-4ed9-4131-ba8e-60fcd6923264-lib-modules\") pod \"kube-proxy-lnfz4\" (UID: \"0c878ad5-4ed9-4131-ba8e-60fcd6923264\") " pod="kube-system/kube-proxy-lnfz4" Feb 9 20:33:15.357819 kubelet[2767]: I0209 20:33:15.357806 2767 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-8czqm\" (UniqueName: \"kubernetes.io/projected/0c878ad5-4ed9-4131-ba8e-60fcd6923264-kube-api-access-8czqm\") pod \"kube-proxy-lnfz4\" (UID: \"0c878ad5-4ed9-4131-ba8e-60fcd6923264\") " pod="kube-system/kube-proxy-lnfz4" Feb 9 20:33:15.358413 kubelet[2767]: I0209 20:33:15.357841 2767 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/0c878ad5-4ed9-4131-ba8e-60fcd6923264-kube-proxy\") pod \"kube-proxy-lnfz4\" (UID: \"0c878ad5-4ed9-4131-ba8e-60fcd6923264\") " pod="kube-system/kube-proxy-lnfz4" Feb 9 20:33:15.410772 env[1563]: time="2024-02-09T20:33:15.410638905Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-cfc98749c-vczrx,Uid:e4b4ead4-71c4-4552-bcb4-a2d54e4e4af1,Namespace:tigera-operator,Attempt:0,}" Feb 9 20:33:15.437789 env[1563]: time="2024-02-09T20:33:15.437573107Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 9 20:33:15.437789 env[1563]: time="2024-02-09T20:33:15.437672088Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 9 20:33:15.437789 env[1563]: time="2024-02-09T20:33:15.437710809Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 9 20:33:15.438259 env[1563]: time="2024-02-09T20:33:15.438050524Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/221ace1df5ff15b9b1231f0432bfed03225c823559b62ddb8412bbcb3f506451 pid=2956 runtime=io.containerd.runc.v2 Feb 9 20:33:15.545920 env[1563]: time="2024-02-09T20:33:15.545888230Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-cfc98749c-vczrx,Uid:e4b4ead4-71c4-4552-bcb4-a2d54e4e4af1,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"221ace1df5ff15b9b1231f0432bfed03225c823559b62ddb8412bbcb3f506451\"" Feb 9 20:33:15.546804 env[1563]: time="2024-02-09T20:33:15.546759028Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.32.3\"" Feb 9 20:33:15.918626 env[1563]: time="2024-02-09T20:33:15.918497368Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-lnfz4,Uid:0c878ad5-4ed9-4131-ba8e-60fcd6923264,Namespace:kube-system,Attempt:0,}" Feb 9 20:33:15.944289 env[1563]: time="2024-02-09T20:33:15.944091273Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 9 20:33:15.944289 env[1563]: time="2024-02-09T20:33:15.944200526Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 9 20:33:15.944685 env[1563]: time="2024-02-09T20:33:15.944265370Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 9 20:33:15.944907 env[1563]: time="2024-02-09T20:33:15.944787455Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/8722ff125143188d4c53b536db2faaba36feaa02224ff48b59b2d2292afa5dea pid=2997 runtime=io.containerd.runc.v2 Feb 9 20:33:16.032804 env[1563]: time="2024-02-09T20:33:16.032700024Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-lnfz4,Uid:0c878ad5-4ed9-4131-ba8e-60fcd6923264,Namespace:kube-system,Attempt:0,} returns sandbox id \"8722ff125143188d4c53b536db2faaba36feaa02224ff48b59b2d2292afa5dea\"" Feb 9 20:33:16.042225 env[1563]: time="2024-02-09T20:33:16.042139025Z" level=info msg="CreateContainer within sandbox \"8722ff125143188d4c53b536db2faaba36feaa02224ff48b59b2d2292afa5dea\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Feb 9 20:33:16.059175 env[1563]: time="2024-02-09T20:33:16.059063164Z" level=info msg="CreateContainer within sandbox \"8722ff125143188d4c53b536db2faaba36feaa02224ff48b59b2d2292afa5dea\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"ea2bbbc4413db7aeed7aa22559108971b9c1f6a7a479f3c627f77683bad3df34\"" Feb 9 20:33:16.060189 env[1563]: time="2024-02-09T20:33:16.060064727Z" level=info msg="StartContainer for \"ea2bbbc4413db7aeed7aa22559108971b9c1f6a7a479f3c627f77683bad3df34\"" Feb 9 20:33:16.166773 env[1563]: time="2024-02-09T20:33:16.166695393Z" level=info msg="StartContainer for \"ea2bbbc4413db7aeed7aa22559108971b9c1f6a7a479f3c627f77683bad3df34\" returns successfully" Feb 9 20:33:16.208000 audit[3096]: NETFILTER_CFG table=mangle:59 family=2 entries=1 op=nft_register_chain pid=3096 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 9 20:33:16.208000 audit[3096]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffc10eaf500 a2=0 a3=7ffc10eaf4ec items=0 ppid=3048 pid=3096 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 
fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:33:16.362393 kernel: audit: type=1325 audit(1707510796.208:230): table=mangle:59 family=2 entries=1 op=nft_register_chain pid=3096 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 9 20:33:16.362452 kernel: audit: type=1300 audit(1707510796.208:230): arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffc10eaf500 a2=0 a3=7ffc10eaf4ec items=0 ppid=3048 pid=3096 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:33:16.362469 kernel: audit: type=1327 audit(1707510796.208:230): proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Feb 9 20:33:16.208000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Feb 9 20:33:16.420372 kernel: audit: type=1325 audit(1707510796.208:231): table=mangle:60 family=10 entries=1 op=nft_register_chain pid=3097 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 9 20:33:16.208000 audit[3097]: NETFILTER_CFG table=mangle:60 family=10 entries=1 op=nft_register_chain pid=3097 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 9 20:33:16.478327 kernel: audit: type=1300 audit(1707510796.208:231): arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7fff329adf00 a2=0 a3=7fff329adeec items=0 ppid=3048 pid=3097 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:33:16.208000 audit[3097]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7fff329adf00 a2=0 a3=7fff329adeec items=0 ppid=3048 pid=3097 auid=4294967295 uid=0 gid=0 euid=0 suid=0 
fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:33:16.574626 kernel: audit: type=1327 audit(1707510796.208:231): proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Feb 9 20:33:16.208000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Feb 9 20:33:16.209000 audit[3098]: NETFILTER_CFG table=nat:61 family=2 entries=1 op=nft_register_chain pid=3098 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 9 20:33:16.690564 kernel: audit: type=1325 audit(1707510796.209:232): table=nat:61 family=2 entries=1 op=nft_register_chain pid=3098 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 9 20:33:16.690601 kernel: audit: type=1300 audit(1707510796.209:232): arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffebdc12e00 a2=0 a3=7ffebdc12dec items=0 ppid=3048 pid=3098 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:33:16.209000 audit[3098]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffebdc12e00 a2=0 a3=7ffebdc12dec items=0 ppid=3048 pid=3098 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:33:16.787314 kernel: audit: type=1327 audit(1707510796.209:232): proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Feb 9 20:33:16.209000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Feb 9 20:33:16.845303 kernel: audit: type=1325 
audit(1707510796.210:233): table=nat:62 family=10 entries=1 op=nft_register_chain pid=3099 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 9 20:33:16.210000 audit[3099]: NETFILTER_CFG table=nat:62 family=10 entries=1 op=nft_register_chain pid=3099 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 9 20:33:16.210000 audit[3099]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fffe046b490 a2=0 a3=7fffe046b47c items=0 ppid=3048 pid=3099 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:33:16.210000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Feb 9 20:33:16.210000 audit[3103]: NETFILTER_CFG table=filter:63 family=2 entries=1 op=nft_register_chain pid=3103 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 9 20:33:16.210000 audit[3103]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffd509ec270 a2=0 a3=7ffd509ec25c items=0 ppid=3048 pid=3103 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:33:16.210000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Feb 9 20:33:16.210000 audit[3104]: NETFILTER_CFG table=filter:64 family=10 entries=1 op=nft_register_chain pid=3104 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 9 20:33:16.210000 audit[3104]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffd4664b810 a2=0 a3=7ffd4664b7fc items=0 ppid=3048 pid=3104 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" 
subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:33:16.210000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Feb 9 20:33:16.313000 audit[3105]: NETFILTER_CFG table=filter:65 family=2 entries=1 op=nft_register_chain pid=3105 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 9 20:33:16.313000 audit[3105]: SYSCALL arch=c000003e syscall=46 success=yes exit=108 a0=3 a1=7ffd2b38a880 a2=0 a3=7ffd2b38a86c items=0 ppid=3048 pid=3105 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:33:16.313000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Feb 9 20:33:16.314000 audit[3107]: NETFILTER_CFG table=filter:66 family=2 entries=1 op=nft_register_rule pid=3107 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 9 20:33:16.314000 audit[3107]: SYSCALL arch=c000003e syscall=46 success=yes exit=752 a0=3 a1=7ffddcd1b230 a2=0 a3=7ffddcd1b21c items=0 ppid=3048 pid=3107 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:33:16.314000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C652073657276696365 Feb 9 20:33:16.316000 audit[3110]: NETFILTER_CFG table=filter:67 family=2 entries=1 op=nft_register_rule pid=3110 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 9 20:33:16.316000 audit[3110]: SYSCALL arch=c000003e syscall=46 success=yes exit=752 a0=3 a1=7ffec4208960 a2=0 
a3=7ffec420894c items=0 ppid=3048 pid=3110 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:33:16.316000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C65207365727669 Feb 9 20:33:16.316000 audit[3111]: NETFILTER_CFG table=filter:68 family=2 entries=1 op=nft_register_chain pid=3111 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 9 20:33:16.316000 audit[3111]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffe179dd8e0 a2=0 a3=7ffe179dd8cc items=0 ppid=3048 pid=3111 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:33:16.316000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Feb 9 20:33:16.318000 audit[3113]: NETFILTER_CFG table=filter:69 family=2 entries=1 op=nft_register_rule pid=3113 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 9 20:33:16.318000 audit[3113]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffe78927600 a2=0 a3=7ffe789275ec items=0 ppid=3048 pid=3113 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:33:16.318000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Feb 9 20:33:16.318000 audit[3114]: NETFILTER_CFG table=filter:70 family=2 entries=1 op=nft_register_chain pid=3114 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 9 20:33:16.318000 audit[3114]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffcfd367180 a2=0 a3=7ffcfd36716c items=0 ppid=3048 pid=3114 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:33:16.318000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D740066696C746572 Feb 9 20:33:16.319000 audit[3116]: NETFILTER_CFG table=filter:71 family=2 entries=1 op=nft_register_rule pid=3116 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 9 20:33:16.319000 audit[3116]: SYSCALL arch=c000003e syscall=46 success=yes exit=744 a0=3 a1=7ffd1c9a6360 a2=0 a3=7ffd1c9a634c items=0 ppid=3048 pid=3116 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:33:16.319000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D Feb 9 20:33:16.321000 audit[3119]: NETFILTER_CFG table=filter:72 family=2 entries=1 op=nft_register_rule pid=3119 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 9 20:33:16.321000 audit[3119]: SYSCALL arch=c000003e syscall=46 success=yes exit=744 a0=3 
a1=7fff7e04daa0 a2=0 a3=7fff7e04da8c items=0 ppid=3048 pid=3119 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:33:16.321000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D53 Feb 9 20:33:16.322000 audit[3120]: NETFILTER_CFG table=filter:73 family=2 entries=1 op=nft_register_chain pid=3120 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 9 20:33:16.322000 audit[3120]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffc7bc91fb0 a2=0 a3=7ffc7bc91f9c items=0 ppid=3048 pid=3120 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:33:16.322000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D464F5257415244002D740066696C746572 Feb 9 20:33:16.323000 audit[3122]: NETFILTER_CFG table=filter:74 family=2 entries=1 op=nft_register_rule pid=3122 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 9 20:33:16.323000 audit[3122]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffe9fe2ef70 a2=0 a3=7ffe9fe2ef5c items=0 ppid=3048 pid=3122 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:33:16.323000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Feb 9 20:33:16.324000 audit[3123]: NETFILTER_CFG table=filter:75 family=2 entries=1 op=nft_register_chain pid=3123 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 9 20:33:16.324000 audit[3123]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffd01bc48a0 a2=0 a3=7ffd01bc488c items=0 ppid=3048 pid=3123 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:33:16.324000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Feb 9 20:33:16.904000 audit[3125]: NETFILTER_CFG table=filter:76 family=2 entries=1 op=nft_register_rule pid=3125 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 9 20:33:16.904000 audit[3125]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffefb62a3d0 a2=0 a3=7ffefb62a3bc items=0 ppid=3048 pid=3125 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:33:16.904000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Feb 9 20:33:16.906000 audit[3128]: NETFILTER_CFG table=filter:77 family=2 entries=1 op=nft_register_rule pid=3128 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 9 20:33:16.906000 audit[3128]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 
a1=7fff7969f0b0 a2=0 a3=7fff7969f09c items=0 ppid=3048 pid=3128 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:33:16.906000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Feb 9 20:33:16.908000 audit[3131]: NETFILTER_CFG table=filter:78 family=2 entries=1 op=nft_register_rule pid=3131 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 9 20:33:16.908000 audit[3131]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7fff32aa1000 a2=0 a3=7fff32aa0fec items=0 ppid=3048 pid=3131 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:33:16.908000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D Feb 9 20:33:16.909000 audit[3132]: NETFILTER_CFG table=nat:79 family=2 entries=1 op=nft_register_chain pid=3132 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 9 20:33:16.909000 audit[3132]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffd2eafae00 a2=0 a3=7ffd2eafadec items=0 ppid=3048 pid=3132 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:33:16.909000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D74006E6174 Feb 9 20:33:16.910000 audit[3134]: NETFILTER_CFG table=nat:80 family=2 entries=1 op=nft_register_rule pid=3134 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 9 20:33:16.910000 audit[3134]: SYSCALL arch=c000003e syscall=46 success=yes exit=524 a0=3 a1=7ffdaef04660 a2=0 a3=7ffdaef0464c items=0 ppid=3048 pid=3134 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:33:16.910000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Feb 9 20:33:16.912000 audit[3137]: NETFILTER_CFG table=nat:81 family=2 entries=1 op=nft_register_rule pid=3137 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 9 20:33:16.912000 audit[3137]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffed9a06c30 a2=0 a3=7ffed9a06c1c items=0 ppid=3048 pid=3137 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:33:16.912000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Feb 9 20:33:16.923358 kubelet[2767]: I0209 20:33:16.923327 2767 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="kube-system/kube-proxy-lnfz4" podStartSLOduration=1.923297053 pod.CreationTimestamp="2024-02-09 20:33:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2024-02-09 20:33:16.923009869 +0000 UTC m=+14.231438155" watchObservedRunningTime="2024-02-09 20:33:16.923297053 +0000 UTC m=+14.231725335" Feb 9 20:33:16.924000 audit[3141]: NETFILTER_CFG table=filter:82 family=2 entries=6 op=nft_register_rule pid=3141 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 9 20:33:16.924000 audit[3141]: SYSCALL arch=c000003e syscall=46 success=yes exit=4028 a0=3 a1=7fff33fac340 a2=0 a3=7fff33fac32c items=0 ppid=3048 pid=3141 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:33:16.924000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 9 20:33:16.928000 audit[3141]: NETFILTER_CFG table=nat:83 family=2 entries=17 op=nft_register_chain pid=3141 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 9 20:33:16.928000 audit[3141]: SYSCALL arch=c000003e syscall=46 success=yes exit=5340 a0=3 a1=7fff33fac340 a2=0 a3=7fff33fac32c items=0 ppid=3048 pid=3141 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:33:16.928000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 9 20:33:16.946000 audit[3145]: NETFILTER_CFG table=filter:84 family=10 entries=1 op=nft_register_chain pid=3145 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 9 20:33:16.946000 audit[3145]: SYSCALL arch=c000003e syscall=46 success=yes exit=108 a0=3 a1=7ffc462f4760 a2=0 a3=7ffc462f474c items=0 ppid=3048 pid=3145 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" 
exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:33:16.946000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Feb 9 20:33:16.952000 audit[3147]: NETFILTER_CFG table=filter:85 family=10 entries=2 op=nft_register_chain pid=3147 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 9 20:33:16.952000 audit[3147]: SYSCALL arch=c000003e syscall=46 success=yes exit=836 a0=3 a1=7fff11d300d0 a2=0 a3=7fff11d300bc items=0 ppid=3048 pid=3147 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:33:16.952000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C6520736572766963 Feb 9 20:33:16.960000 audit[3150]: NETFILTER_CFG table=filter:86 family=10 entries=2 op=nft_register_chain pid=3150 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 9 20:33:16.960000 audit[3150]: SYSCALL arch=c000003e syscall=46 success=yes exit=836 a0=3 a1=7ffdc8642de0 a2=0 a3=7ffdc8642dcc items=0 ppid=3048 pid=3150 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:33:16.960000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C652073657276 Feb 9 20:33:16.963000 audit[3151]: NETFILTER_CFG table=filter:87 family=10 entries=1 
op=nft_register_chain pid=3151 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 9 20:33:16.963000 audit[3151]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fff37528c30 a2=0 a3=7fff37528c1c items=0 ppid=3048 pid=3151 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:33:16.963000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Feb 9 20:33:16.970000 audit[3153]: NETFILTER_CFG table=filter:88 family=10 entries=1 op=nft_register_rule pid=3153 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 9 20:33:16.970000 audit[3153]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffd0eeeadf0 a2=0 a3=7ffd0eeeaddc items=0 ppid=3048 pid=3153 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:33:16.970000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Feb 9 20:33:16.972000 audit[3154]: NETFILTER_CFG table=filter:89 family=10 entries=1 op=nft_register_chain pid=3154 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 9 20:33:16.972000 audit[3154]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffc648eb640 a2=0 a3=7ffc648eb62c items=0 ppid=3048 pid=3154 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:33:16.972000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D740066696C746572 Feb 9 20:33:16.979000 audit[3156]: NETFILTER_CFG table=filter:90 family=10 entries=1 op=nft_register_rule pid=3156 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 9 20:33:16.979000 audit[3156]: SYSCALL arch=c000003e syscall=46 success=yes exit=744 a0=3 a1=7fff0968c000 a2=0 a3=7fff0968bfec items=0 ppid=3048 pid=3156 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:33:16.979000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B554245 Feb 9 20:33:16.981417 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3413948207.mount: Deactivated successfully. 
Feb 9 20:33:16.988000 audit[3159]: NETFILTER_CFG table=filter:91 family=10 entries=2 op=nft_register_chain pid=3159 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 9 20:33:16.988000 audit[3159]: SYSCALL arch=c000003e syscall=46 success=yes exit=828 a0=3 a1=7ffe06d942c0 a2=0 a3=7ffe06d942ac items=0 ppid=3048 pid=3159 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:33:16.988000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D Feb 9 20:33:16.991000 audit[3160]: NETFILTER_CFG table=filter:92 family=10 entries=1 op=nft_register_chain pid=3160 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 9 20:33:16.991000 audit[3160]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fff0b1ee530 a2=0 a3=7fff0b1ee51c items=0 ppid=3048 pid=3160 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:33:16.991000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D464F5257415244002D740066696C746572 Feb 9 20:33:16.996000 audit[3162]: NETFILTER_CFG table=filter:93 family=10 entries=1 op=nft_register_rule pid=3162 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 9 20:33:16.996000 audit[3162]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffd0cbc4da0 a2=0 a3=7ffd0cbc4d8c items=0 ppid=3048 pid=3162 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" 
subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:33:16.996000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Feb 9 20:33:16.998000 audit[3163]: NETFILTER_CFG table=filter:94 family=10 entries=1 op=nft_register_chain pid=3163 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 9 20:33:16.998000 audit[3163]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffc9f2107f0 a2=0 a3=7ffc9f2107dc items=0 ppid=3048 pid=3163 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:33:16.998000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Feb 9 20:33:17.002000 audit[3165]: NETFILTER_CFG table=filter:95 family=10 entries=1 op=nft_register_rule pid=3165 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 9 20:33:17.002000 audit[3165]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7fffe81c4520 a2=0 a3=7fffe81c450c items=0 ppid=3048 pid=3165 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:33:17.002000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Feb 9 20:33:17.007000 audit[3168]: NETFILTER_CFG table=filter:96 family=10 entries=1 op=nft_register_rule pid=3168 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 9 
20:33:17.007000 audit[3168]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffeb7c268b0 a2=0 a3=7ffeb7c2689c items=0 ppid=3048 pid=3168 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:33:17.007000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D Feb 9 20:33:17.011000 audit[3171]: NETFILTER_CFG table=filter:97 family=10 entries=1 op=nft_register_rule pid=3171 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 9 20:33:17.011000 audit[3171]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffd14909ec0 a2=0 a3=7ffd14909eac items=0 ppid=3048 pid=3171 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:33:17.011000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C Feb 9 20:33:17.012000 audit[3172]: NETFILTER_CFG table=nat:98 family=10 entries=1 op=nft_register_chain pid=3172 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 9 20:33:17.012000 audit[3172]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffce73cf2a0 a2=0 a3=7ffce73cf28c items=0 ppid=3048 pid=3172 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 
20:33:17.012000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D74006E6174 Feb 9 20:33:17.014000 audit[3174]: NETFILTER_CFG table=nat:99 family=10 entries=2 op=nft_register_chain pid=3174 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 9 20:33:17.014000 audit[3174]: SYSCALL arch=c000003e syscall=46 success=yes exit=600 a0=3 a1=7ffee8ccb510 a2=0 a3=7ffee8ccb4fc items=0 ppid=3048 pid=3174 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:33:17.014000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Feb 9 20:33:17.017000 audit[3177]: NETFILTER_CFG table=nat:100 family=10 entries=2 op=nft_register_chain pid=3177 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 9 20:33:17.017000 audit[3177]: SYSCALL arch=c000003e syscall=46 success=yes exit=608 a0=3 a1=7ffea035cf40 a2=0 a3=7ffea035cf2c items=0 ppid=3048 pid=3177 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:33:17.017000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Feb 9 20:33:17.021000 audit[3181]: NETFILTER_CFG table=filter:101 family=10 entries=3 op=nft_register_rule pid=3181 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Feb 9 20:33:17.021000 audit[3181]: SYSCALL arch=c000003e syscall=46 success=yes exit=1916 a0=3 a1=7fffdfe508b0 a2=0 
a3=7fffdfe5089c items=0 ppid=3048 pid=3181 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:33:17.021000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 9 20:33:17.021000 audit[3181]: NETFILTER_CFG table=nat:102 family=10 entries=10 op=nft_register_chain pid=3181 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Feb 9 20:33:17.021000 audit[3181]: SYSCALL arch=c000003e syscall=46 success=yes exit=1968 a0=3 a1=7fffdfe508b0 a2=0 a3=7fffdfe5089c items=0 ppid=3048 pid=3181 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:33:17.021000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 9 20:33:18.170494 env[1563]: time="2024-02-09T20:33:18.170472445Z" level=info msg="ImageCreate event &ImageCreate{Name:quay.io/tigera/operator:v1.32.3,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 9 20:33:18.171001 env[1563]: time="2024-02-09T20:33:18.170990214Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:7bc79e0d3be4fa8c35133127424f9b1ec775af43145b7dd58637905c76084827,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 9 20:33:18.172147 env[1563]: time="2024-02-09T20:33:18.172133345Z" level=info msg="ImageUpdate event &ImageUpdate{Name:quay.io/tigera/operator:v1.32.3,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 9 20:33:18.172883 env[1563]: time="2024-02-09T20:33:18.172858779Z" level=info msg="ImageCreate event 
&ImageCreate{Name:quay.io/tigera/operator@sha256:715ac9a30f8a9579e44258af20de354715429e11836b493918e9e1a696e9b028,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 9 20:33:18.173225 env[1563]: time="2024-02-09T20:33:18.173197603Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.32.3\" returns image reference \"sha256:7bc79e0d3be4fa8c35133127424f9b1ec775af43145b7dd58637905c76084827\"" Feb 9 20:33:18.174321 env[1563]: time="2024-02-09T20:33:18.174306578Z" level=info msg="CreateContainer within sandbox \"221ace1df5ff15b9b1231f0432bfed03225c823559b62ddb8412bbcb3f506451\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Feb 9 20:33:18.178576 env[1563]: time="2024-02-09T20:33:18.178533921Z" level=info msg="CreateContainer within sandbox \"221ace1df5ff15b9b1231f0432bfed03225c823559b62ddb8412bbcb3f506451\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"953c7136262388e13d7718ef1b56b04bafe1b9843eb9392829fb17703a591d71\"" Feb 9 20:33:18.178864 env[1563]: time="2024-02-09T20:33:18.178832567Z" level=info msg="StartContainer for \"953c7136262388e13d7718ef1b56b04bafe1b9843eb9392829fb17703a591d71\"" Feb 9 20:33:18.225292 env[1563]: time="2024-02-09T20:33:18.225263513Z" level=info msg="StartContainer for \"953c7136262388e13d7718ef1b56b04bafe1b9843eb9392829fb17703a591d71\" returns successfully" Feb 9 20:33:18.935810 kubelet[2767]: I0209 20:33:18.935792 2767 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="tigera-operator/tigera-operator-cfc98749c-vczrx" podStartSLOduration=-9.223372032919008e+09 pod.CreationTimestamp="2024-02-09 20:33:15 +0000 UTC" firstStartedPulling="2024-02-09 20:33:15.546521344 +0000 UTC m=+12.854949628" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-02-09 20:33:18.935719057 +0000 UTC m=+16.244147341" watchObservedRunningTime="2024-02-09 20:33:18.935767864 +0000 UTC m=+16.244196145" Feb 9 20:33:19.896000 
audit[3255]: NETFILTER_CFG table=filter:103 family=2 entries=13 op=nft_register_rule pid=3255 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 9 20:33:19.896000 audit[3255]: SYSCALL arch=c000003e syscall=46 success=yes exit=4732 a0=3 a1=7ffe9a683cc0 a2=0 a3=7ffe9a683cac items=0 ppid=3048 pid=3255 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:33:19.896000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 9 20:33:19.897000 audit[3255]: NETFILTER_CFG table=nat:104 family=2 entries=20 op=nft_register_rule pid=3255 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 9 20:33:19.897000 audit[3255]: SYSCALL arch=c000003e syscall=46 success=yes exit=5340 a0=3 a1=7ffe9a683cc0 a2=0 a3=7ffe9a683cac items=0 ppid=3048 pid=3255 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:33:19.897000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 9 20:33:19.953641 kubelet[2767]: I0209 20:33:19.953577 2767 topology_manager.go:210] "Topology Admit Handler" Feb 9 20:33:19.983000 audit[3281]: NETFILTER_CFG table=filter:105 family=2 entries=14 op=nft_register_rule pid=3281 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 9 20:33:19.983000 audit[3281]: SYSCALL arch=c000003e syscall=46 success=yes exit=4732 a0=3 a1=7ffca8d3e9b0 a2=0 a3=7ffca8d3e99c items=0 ppid=3048 pid=3281 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) 
Feb 9 20:33:19.983000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 9 20:33:19.986838 kubelet[2767]: I0209 20:33:19.986794 2767 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtj9t\" (UniqueName: \"kubernetes.io/projected/a029d566-7677-4b14-8b19-30eeaad7a78d-kube-api-access-wtj9t\") pod \"calico-typha-8546f8997d-f95j4\" (UID: \"a029d566-7677-4b14-8b19-30eeaad7a78d\") " pod="calico-system/calico-typha-8546f8997d-f95j4" Feb 9 20:33:19.986838 kubelet[2767]: I0209 20:33:19.986840 2767 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/a029d566-7677-4b14-8b19-30eeaad7a78d-typha-certs\") pod \"calico-typha-8546f8997d-f95j4\" (UID: \"a029d566-7677-4b14-8b19-30eeaad7a78d\") " pod="calico-system/calico-typha-8546f8997d-f95j4" Feb 9 20:33:19.986967 kubelet[2767]: I0209 20:33:19.986950 2767 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a029d566-7677-4b14-8b19-30eeaad7a78d-tigera-ca-bundle\") pod \"calico-typha-8546f8997d-f95j4\" (UID: \"a029d566-7677-4b14-8b19-30eeaad7a78d\") " pod="calico-system/calico-typha-8546f8997d-f95j4" Feb 9 20:33:19.989068 kubelet[2767]: I0209 20:33:19.989051 2767 topology_manager.go:210] "Topology Admit Handler" Feb 9 20:33:19.984000 audit[3281]: NETFILTER_CFG table=nat:106 family=2 entries=20 op=nft_register_rule pid=3281 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 9 20:33:19.984000 audit[3281]: SYSCALL arch=c000003e syscall=46 success=yes exit=5340 a0=3 a1=7ffca8d3e9b0 a2=0 a3=7ffca8d3e99c items=0 ppid=3048 pid=3281 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" 
subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:33:19.984000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 9 20:33:20.087812 kubelet[2767]: I0209 20:33:20.087752 2767 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/a03330e2-7655-4766-b4a3-8964354a083e-var-lib-calico\") pod \"calico-node-kl8ql\" (UID: \"a03330e2-7655-4766-b4a3-8964354a083e\") " pod="calico-system/calico-node-kl8ql" Feb 9 20:33:20.088172 kubelet[2767]: I0209 20:33:20.087869 2767 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/a03330e2-7655-4766-b4a3-8964354a083e-xtables-lock\") pod \"calico-node-kl8ql\" (UID: \"a03330e2-7655-4766-b4a3-8964354a083e\") " pod="calico-system/calico-node-kl8ql" Feb 9 20:33:20.088172 kubelet[2767]: I0209 20:33:20.088030 2767 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/a03330e2-7655-4766-b4a3-8964354a083e-flexvol-driver-host\") pod \"calico-node-kl8ql\" (UID: \"a03330e2-7655-4766-b4a3-8964354a083e\") " pod="calico-system/calico-node-kl8ql" Feb 9 20:33:20.088560 kubelet[2767]: I0209 20:33:20.088332 2767 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/a03330e2-7655-4766-b4a3-8964354a083e-policysync\") pod \"calico-node-kl8ql\" (UID: \"a03330e2-7655-4766-b4a3-8964354a083e\") " pod="calico-system/calico-node-kl8ql" Feb 9 20:33:20.088560 kubelet[2767]: I0209 20:33:20.088525 2767 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: 
\"kubernetes.io/host-path/a03330e2-7655-4766-b4a3-8964354a083e-var-run-calico\") pod \"calico-node-kl8ql\" (UID: \"a03330e2-7655-4766-b4a3-8964354a083e\") " pod="calico-system/calico-node-kl8ql" Feb 9 20:33:20.088906 kubelet[2767]: I0209 20:33:20.088775 2767 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a03330e2-7655-4766-b4a3-8964354a083e-tigera-ca-bundle\") pod \"calico-node-kl8ql\" (UID: \"a03330e2-7655-4766-b4a3-8964354a083e\") " pod="calico-system/calico-node-kl8ql" Feb 9 20:33:20.088906 kubelet[2767]: I0209 20:33:20.088882 2767 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a03330e2-7655-4766-b4a3-8964354a083e-lib-modules\") pod \"calico-node-kl8ql\" (UID: \"a03330e2-7655-4766-b4a3-8964354a083e\") " pod="calico-system/calico-node-kl8ql" Feb 9 20:33:20.089299 kubelet[2767]: I0209 20:33:20.089027 2767 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/a03330e2-7655-4766-b4a3-8964354a083e-node-certs\") pod \"calico-node-kl8ql\" (UID: \"a03330e2-7655-4766-b4a3-8964354a083e\") " pod="calico-system/calico-node-kl8ql" Feb 9 20:33:20.089299 kubelet[2767]: I0209 20:33:20.089147 2767 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/a03330e2-7655-4766-b4a3-8964354a083e-cni-bin-dir\") pod \"calico-node-kl8ql\" (UID: \"a03330e2-7655-4766-b4a3-8964354a083e\") " pod="calico-system/calico-node-kl8ql" Feb 9 20:33:20.089299 kubelet[2767]: I0209 20:33:20.089215 2767 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/a03330e2-7655-4766-b4a3-8964354a083e-cni-net-dir\") pod 
\"calico-node-kl8ql\" (UID: \"a03330e2-7655-4766-b4a3-8964354a083e\") " pod="calico-system/calico-node-kl8ql" Feb 9 20:33:20.089820 kubelet[2767]: I0209 20:33:20.089408 2767 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/a03330e2-7655-4766-b4a3-8964354a083e-cni-log-dir\") pod \"calico-node-kl8ql\" (UID: \"a03330e2-7655-4766-b4a3-8964354a083e\") " pod="calico-system/calico-node-kl8ql" Feb 9 20:33:20.089820 kubelet[2767]: I0209 20:33:20.089586 2767 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94drh\" (UniqueName: \"kubernetes.io/projected/a03330e2-7655-4766-b4a3-8964354a083e-kube-api-access-94drh\") pod \"calico-node-kl8ql\" (UID: \"a03330e2-7655-4766-b4a3-8964354a083e\") " pod="calico-system/calico-node-kl8ql" Feb 9 20:33:20.120478 kubelet[2767]: I0209 20:33:20.120417 2767 topology_manager.go:210] "Topology Admit Handler" Feb 9 20:33:20.121105 kubelet[2767]: E0209 20:33:20.121059 2767 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-dx2ql" podUID=21aae8c4-8c7c-48d6-86a1-b78761bdb569 Feb 9 20:33:20.190376 kubelet[2767]: I0209 20:33:20.190302 2767 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/21aae8c4-8c7c-48d6-86a1-b78761bdb569-registration-dir\") pod \"csi-node-driver-dx2ql\" (UID: \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\") " pod="calico-system/csi-node-driver-dx2ql" Feb 9 20:33:20.190492 kubelet[2767]: I0209 20:33:20.190400 2767 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: 
\"kubernetes.io/host-path/21aae8c4-8c7c-48d6-86a1-b78761bdb569-varrun\") pod \"csi-node-driver-dx2ql\" (UID: \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\") " pod="calico-system/csi-node-driver-dx2ql" Feb 9 20:33:20.190492 kubelet[2767]: I0209 20:33:20.190426 2767 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/21aae8c4-8c7c-48d6-86a1-b78761bdb569-kubelet-dir\") pod \"csi-node-driver-dx2ql\" (UID: \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\") " pod="calico-system/csi-node-driver-dx2ql" Feb 9 20:33:20.190710 kubelet[2767]: E0209 20:33:20.190700 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:20.190710 kubelet[2767]: W0209 20:33:20.190709 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:20.190789 kubelet[2767]: E0209 20:33:20.190724 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:20.190903 kubelet[2767]: E0209 20:33:20.190894 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:20.190903 kubelet[2767]: W0209 20:33:20.190902 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:20.190980 kubelet[2767]: E0209 20:33:20.190913 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:20.191061 kubelet[2767]: E0209 20:33:20.191052 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:20.191101 kubelet[2767]: W0209 20:33:20.191062 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:20.191101 kubelet[2767]: E0209 20:33:20.191073 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:20.191201 kubelet[2767]: E0209 20:33:20.191194 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:20.191201 kubelet[2767]: W0209 20:33:20.191201 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:20.191274 kubelet[2767]: E0209 20:33:20.191211 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:20.191274 kubelet[2767]: I0209 20:33:20.191224 2767 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/21aae8c4-8c7c-48d6-86a1-b78761bdb569-socket-dir\") pod \"csi-node-driver-dx2ql\" (UID: \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\") " pod="calico-system/csi-node-driver-dx2ql" Feb 9 20:33:20.191404 kubelet[2767]: E0209 20:33:20.191322 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:20.191404 kubelet[2767]: W0209 20:33:20.191331 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:20.191404 kubelet[2767]: E0209 20:33:20.191351 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:20.191509 kubelet[2767]: E0209 20:33:20.191489 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:20.191509 kubelet[2767]: W0209 20:33:20.191496 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:20.191509 kubelet[2767]: E0209 20:33:20.191507 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:20.195132 kubelet[2767]: I0209 20:33:20.195057 2767 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77998\" (UniqueName: \"kubernetes.io/projected/21aae8c4-8c7c-48d6-86a1-b78761bdb569-kube-api-access-77998\") pod \"csi-node-driver-dx2ql\" (UID: \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\") " pod="calico-system/csi-node-driver-dx2ql" Feb 9 20:33:20.195267 kubelet[2767]: E0209 20:33:20.195157 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:20.195267 kubelet[2767]: W0209 20:33:20.195162 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:20.195267 kubelet[2767]: E0209 20:33:20.195173 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:20.195267 kubelet[2767]: E0209 20:33:20.195247 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:20.195267 kubelet[2767]: W0209 20:33:20.195252 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:20.195267 kubelet[2767]: E0209 20:33:20.195260 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:20.196291 kubelet[2767]: E0209 20:33:20.196195 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:20.196291 kubelet[2767]: W0209 20:33:20.196200 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:20.196291 kubelet[2767]: E0209 20:33:20.196206 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:20.261937 env[1563]: time="2024-02-09T20:33:20.261885722Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-8546f8997d-f95j4,Uid:a029d566-7677-4b14-8b19-30eeaad7a78d,Namespace:calico-system,Attempt:0,}" Feb 9 20:33:20.268678 env[1563]: time="2024-02-09T20:33:20.268648099Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 9 20:33:20.268678 env[1563]: time="2024-02-09T20:33:20.268667899Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 9 20:33:20.268766 env[1563]: time="2024-02-09T20:33:20.268677758Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 9 20:33:20.268766 env[1563]: time="2024-02-09T20:33:20.268741057Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/400c7caed17c29ec6c9d8480d9388563ccbff391b0df4fd0ef792e1c4ebac91f pid=3340 runtime=io.containerd.runc.v2 Feb 9 20:33:20.295598 kubelet[2767]: E0209 20:33:20.295582 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:20.295598 kubelet[2767]: W0209 20:33:20.295593 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:20.295721 kubelet[2767]: E0209 20:33:20.295604 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:20.295763 kubelet[2767]: E0209 20:33:20.295721 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:20.295763 kubelet[2767]: W0209 20:33:20.295726 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:20.295763 kubelet[2767]: E0209 20:33:20.295734 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:20.296604 kubelet[2767]: E0209 20:33:20.296582 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:20.296604 kubelet[2767]: W0209 20:33:20.296587 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:20.296604 kubelet[2767]: E0209 20:33:20.296593 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:20.296708 kubelet[2767]: E0209 20:33:20.296669 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:20.296708 kubelet[2767]: W0209 20:33:20.296673 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:20.296708 kubelet[2767]: E0209 20:33:20.296678 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:20.296796 kubelet[2767]: E0209 20:33:20.296790 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:20.296796 kubelet[2767]: W0209 20:33:20.296794 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:20.296854 kubelet[2767]: E0209 20:33:20.296801 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:20.296888 kubelet[2767]: E0209 20:33:20.296885 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:20.296917 kubelet[2767]: W0209 20:33:20.296889 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:20.296917 kubelet[2767]: E0209 20:33:20.296895 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:20.296973 kubelet[2767]: E0209 20:33:20.296965 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:20.296973 kubelet[2767]: W0209 20:33:20.296972 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:20.297033 kubelet[2767]: E0209 20:33:20.296981 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:20.297063 kubelet[2767]: E0209 20:33:20.297046 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:20.297063 kubelet[2767]: W0209 20:33:20.297050 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:20.297122 kubelet[2767]: E0209 20:33:20.297071 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:20.297122 kubelet[2767]: E0209 20:33:20.297117 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:20.297122 kubelet[2767]: W0209 20:33:20.297121 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:20.297209 kubelet[2767]: E0209 20:33:20.297139 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:20.297209 kubelet[2767]: E0209 20:33:20.297184 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:20.297209 kubelet[2767]: W0209 20:33:20.297188 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:20.297209 kubelet[2767]: E0209 20:33:20.297195 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:20.297316 kubelet[2767]: E0209 20:33:20.297255 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:20.297316 kubelet[2767]: W0209 20:33:20.297258 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:20.297316 kubelet[2767]: E0209 20:33:20.297264 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:20.297404 kubelet[2767]: E0209 20:33:20.297335 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:20.297404 kubelet[2767]: W0209 20:33:20.297345 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:20.297404 kubelet[2767]: E0209 20:33:20.297352 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:20.297487 kubelet[2767]: E0209 20:33:20.297427 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:20.297487 kubelet[2767]: W0209 20:33:20.297431 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:20.297487 kubelet[2767]: E0209 20:33:20.297437 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:20.297605 kubelet[2767]: E0209 20:33:20.297520 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:20.297605 kubelet[2767]: W0209 20:33:20.297530 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:20.297605 kubelet[2767]: E0209 20:33:20.297544 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:20.297605 kubelet[2767]: E0209 20:33:20.297605 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:20.297708 kubelet[2767]: W0209 20:33:20.297609 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:20.297708 kubelet[2767]: E0209 20:33:20.297616 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:20.297708 kubelet[2767]: E0209 20:33:20.297690 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:20.297708 kubelet[2767]: W0209 20:33:20.297694 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:20.297708 kubelet[2767]: E0209 20:33:20.297699 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:20.297842 kubelet[2767]: E0209 20:33:20.297764 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:20.297842 kubelet[2767]: W0209 20:33:20.297768 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:20.297842 kubelet[2767]: E0209 20:33:20.297773 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:20.297922 kubelet[2767]: E0209 20:33:20.297888 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:20.297922 kubelet[2767]: W0209 20:33:20.297893 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:20.297922 kubelet[2767]: E0209 20:33:20.297898 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:20.321361 env[1563]: time="2024-02-09T20:33:20.321317361Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-8546f8997d-f95j4,Uid:a029d566-7677-4b14-8b19-30eeaad7a78d,Namespace:calico-system,Attempt:0,} returns sandbox id \"400c7caed17c29ec6c9d8480d9388563ccbff391b0df4fd0ef792e1c4ebac91f\"" Feb 9 20:33:20.322195 env[1563]: time="2024-02-09T20:33:20.322180524Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.27.0\"" Feb 9 20:33:20.397655 kubelet[2767]: E0209 20:33:20.397610 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:20.397655 kubelet[2767]: W0209 20:33:20.397623 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:20.397655 kubelet[2767]: E0209 20:33:20.397635 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:20.397801 kubelet[2767]: E0209 20:33:20.397792 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:20.397827 kubelet[2767]: W0209 20:33:20.397801 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:20.397827 kubelet[2767]: E0209 20:33:20.397812 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:20.498796 kubelet[2767]: E0209 20:33:20.498671 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:20.498796 kubelet[2767]: W0209 20:33:20.498693 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:20.498796 kubelet[2767]: E0209 20:33:20.498714 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:20.498995 kubelet[2767]: E0209 20:33:20.498935 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:20.498995 kubelet[2767]: W0209 20:33:20.498945 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:20.498995 kubelet[2767]: E0209 20:33:20.498959 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:20.564193 kubelet[2767]: E0209 20:33:20.564147 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:20.564193 kubelet[2767]: W0209 20:33:20.564164 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:20.564193 kubelet[2767]: E0209 20:33:20.564184 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:20.592889 env[1563]: time="2024-02-09T20:33:20.592850831Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-kl8ql,Uid:a03330e2-7655-4766-b4a3-8964354a083e,Namespace:calico-system,Attempt:0,}" Feb 9 20:33:20.600321 kubelet[2767]: E0209 20:33:20.600299 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:20.600321 kubelet[2767]: W0209 20:33:20.600317 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:20.600486 kubelet[2767]: E0209 20:33:20.600353 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:20.603034 env[1563]: time="2024-02-09T20:33:20.602942215Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 9 20:33:20.603034 env[1563]: time="2024-02-09T20:33:20.603000434Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 9 20:33:20.603034 env[1563]: time="2024-02-09T20:33:20.603016749Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 9 20:33:20.603254 env[1563]: time="2024-02-09T20:33:20.603162947Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/065f0ab880c22f2d020e2766cde9b3415d10e5606e1bc08e75e42d08a0bf0215 pid=3435 runtime=io.containerd.runc.v2 Feb 9 20:33:20.636212 env[1563]: time="2024-02-09T20:33:20.636162712Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-kl8ql,Uid:a03330e2-7655-4766-b4a3-8964354a083e,Namespace:calico-system,Attempt:0,} returns sandbox id \"065f0ab880c22f2d020e2766cde9b3415d10e5606e1bc08e75e42d08a0bf0215\"" Feb 9 20:33:20.702128 kubelet[2767]: E0209 20:33:20.702061 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:20.702128 kubelet[2767]: W0209 20:33:20.702102 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:20.702510 kubelet[2767]: E0209 20:33:20.702150 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:20.774574 kubelet[2767]: E0209 20:33:20.774420 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:20.774574 kubelet[2767]: W0209 20:33:20.774464 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:20.774574 kubelet[2767]: E0209 20:33:20.774524 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:21.048000 audit[3496]: NETFILTER_CFG table=filter:107 family=2 entries=14 op=nft_register_rule pid=3496 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 9 20:33:21.048000 audit[3496]: SYSCALL arch=c000003e syscall=46 success=yes exit=4732 a0=3 a1=7ffcbbb7d210 a2=0 a3=7ffcbbb7d1fc items=0 ppid=3048 pid=3496 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:33:21.048000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 9 20:33:21.049000 audit[3496]: NETFILTER_CFG table=nat:108 family=2 entries=20 op=nft_register_rule pid=3496 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 9 20:33:21.049000 audit[3496]: SYSCALL arch=c000003e syscall=46 success=yes exit=5340 a0=3 a1=7ffcbbb7d210 a2=0 a3=7ffcbbb7d1fc items=0 ppid=3048 pid=3496 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:33:21.049000 audit: PROCTITLE 
proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 9 20:33:21.883888 kubelet[2767]: E0209 20:33:21.883834 2767 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-dx2ql" podUID=21aae8c4-8c7c-48d6-86a1-b78761bdb569 Feb 9 20:33:23.884760 kubelet[2767]: E0209 20:33:23.884667 2767 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-dx2ql" podUID=21aae8c4-8c7c-48d6-86a1-b78761bdb569 Feb 9 20:33:24.920259 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2478472756.mount: Deactivated successfully. Feb 9 20:33:25.883927 kubelet[2767]: E0209 20:33:25.883817 2767 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-dx2ql" podUID=21aae8c4-8c7c-48d6-86a1-b78761bdb569 Feb 9 20:33:27.884588 kubelet[2767]: E0209 20:33:27.884496 2767 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-dx2ql" podUID=21aae8c4-8c7c-48d6-86a1-b78761bdb569 Feb 9 20:33:29.884126 kubelet[2767]: E0209 20:33:29.884021 2767 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-dx2ql" podUID=21aae8c4-8c7c-48d6-86a1-b78761bdb569 Feb 9 20:33:30.925079 env[1563]: time="2024-02-09T20:33:30.925027546Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/typha:v3.27.0,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 9 20:33:30.925710 env[1563]: time="2024-02-09T20:33:30.925664493Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:b33768e0da1f8a5788a6a5d8ac2dcf15292ea9f3717de450f946c0a055b3532c,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 9 20:33:30.926991 env[1563]: time="2024-02-09T20:33:30.926952992Z" level=info msg="ImageUpdate event &ImageUpdate{Name:ghcr.io/flatcar/calico/typha:v3.27.0,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 9 20:33:30.928010 env[1563]: time="2024-02-09T20:33:30.927967124Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/typha@sha256:5f2d3b8c354a4eb6de46e786889913916e620c6c256982fb8d0f1a1d36a282bc,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 9 20:33:30.928551 env[1563]: time="2024-02-09T20:33:30.928511197Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.27.0\" returns image reference \"sha256:b33768e0da1f8a5788a6a5d8ac2dcf15292ea9f3717de450f946c0a055b3532c\"" Feb 9 20:33:30.928838 env[1563]: time="2024-02-09T20:33:30.928806442Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.27.0\"" Feb 9 20:33:30.932889 env[1563]: time="2024-02-09T20:33:30.932842935Z" level=info msg="CreateContainer within sandbox \"400c7caed17c29ec6c9d8480d9388563ccbff391b0df4fd0ef792e1c4ebac91f\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Feb 9 20:33:30.936820 env[1563]: time="2024-02-09T20:33:30.936806154Z" level=info msg="CreateContainer within sandbox 
\"400c7caed17c29ec6c9d8480d9388563ccbff391b0df4fd0ef792e1c4ebac91f\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"9eb2a7bfc6c5efbae181c2554b0cd05b997ddd0606b4db5329d8879698e11a87\"" Feb 9 20:33:30.937066 env[1563]: time="2024-02-09T20:33:30.937017489Z" level=info msg="StartContainer for \"9eb2a7bfc6c5efbae181c2554b0cd05b997ddd0606b4db5329d8879698e11a87\"" Feb 9 20:33:30.939774 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2206330173.mount: Deactivated successfully. Feb 9 20:33:30.990771 env[1563]: time="2024-02-09T20:33:30.990720676Z" level=info msg="StartContainer for \"9eb2a7bfc6c5efbae181c2554b0cd05b997ddd0606b4db5329d8879698e11a87\" returns successfully" Feb 9 20:33:31.884760 kubelet[2767]: E0209 20:33:31.884688 2767 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-dx2ql" podUID=21aae8c4-8c7c-48d6-86a1-b78761bdb569 Feb 9 20:33:31.964265 env[1563]: time="2024-02-09T20:33:31.964128503Z" level=info msg="StopContainer for \"9eb2a7bfc6c5efbae181c2554b0cd05b997ddd0606b4db5329d8879698e11a87\" with timeout 300 (s)" Feb 9 20:33:31.965187 env[1563]: time="2024-02-09T20:33:31.964871430Z" level=info msg="Stop container \"9eb2a7bfc6c5efbae181c2554b0cd05b997ddd0606b4db5329d8879698e11a87\" with signal terminated" Feb 9 20:33:31.979179 kubelet[2767]: I0209 20:33:31.979152 2767 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="calico-system/calico-typha-8546f8997d-f95j4" podStartSLOduration=-9.223372023875675e+09 pod.CreationTimestamp="2024-02-09 20:33:19 +0000 UTC" firstStartedPulling="2024-02-09 20:33:20.322009196 +0000 UTC m=+17.630437482" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-02-09 20:33:31.9785498 +0000 UTC m=+29.286978084" 
watchObservedRunningTime="2024-02-09 20:33:31.979100268 +0000 UTC m=+29.287528550" Feb 9 20:33:32.012175 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-9eb2a7bfc6c5efbae181c2554b0cd05b997ddd0606b4db5329d8879698e11a87-rootfs.mount: Deactivated successfully. Feb 9 20:33:32.350606 env[1563]: time="2024-02-09T20:33:32.350494135Z" level=info msg="shim disconnected" id=9eb2a7bfc6c5efbae181c2554b0cd05b997ddd0606b4db5329d8879698e11a87 Feb 9 20:33:32.350606 env[1563]: time="2024-02-09T20:33:32.350604918Z" level=warning msg="cleaning up after shim disconnected" id=9eb2a7bfc6c5efbae181c2554b0cd05b997ddd0606b4db5329d8879698e11a87 namespace=k8s.io Feb 9 20:33:32.351166 env[1563]: time="2024-02-09T20:33:32.350637899Z" level=info msg="cleaning up dead shim" Feb 9 20:33:32.379919 env[1563]: time="2024-02-09T20:33:32.379796179Z" level=warning msg="cleanup warnings time=\"2024-02-09T20:33:32Z\" level=info msg=\"starting signal loop\" namespace=k8s.io pid=3570 runtime=io.containerd.runc.v2\n" Feb 9 20:33:32.382122 env[1563]: time="2024-02-09T20:33:32.382006936Z" level=info msg="StopContainer for \"9eb2a7bfc6c5efbae181c2554b0cd05b997ddd0606b4db5329d8879698e11a87\" returns successfully" Feb 9 20:33:32.383418 env[1563]: time="2024-02-09T20:33:32.383286517Z" level=info msg="StopPodSandbox for \"400c7caed17c29ec6c9d8480d9388563ccbff391b0df4fd0ef792e1c4ebac91f\"" Feb 9 20:33:32.383629 env[1563]: time="2024-02-09T20:33:32.383450167Z" level=info msg="Container to stop \"9eb2a7bfc6c5efbae181c2554b0cd05b997ddd0606b4db5329d8879698e11a87\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Feb 9 20:33:32.390743 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-400c7caed17c29ec6c9d8480d9388563ccbff391b0df4fd0ef792e1c4ebac91f-shm.mount: Deactivated successfully. 
Feb 9 20:33:32.427661 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-400c7caed17c29ec6c9d8480d9388563ccbff391b0df4fd0ef792e1c4ebac91f-rootfs.mount: Deactivated successfully. Feb 9 20:33:32.427870 env[1563]: time="2024-02-09T20:33:32.427684821Z" level=info msg="shim disconnected" id=400c7caed17c29ec6c9d8480d9388563ccbff391b0df4fd0ef792e1c4ebac91f Feb 9 20:33:32.427870 env[1563]: time="2024-02-09T20:33:32.427736912Z" level=warning msg="cleaning up after shim disconnected" id=400c7caed17c29ec6c9d8480d9388563ccbff391b0df4fd0ef792e1c4ebac91f namespace=k8s.io Feb 9 20:33:32.427870 env[1563]: time="2024-02-09T20:33:32.427756742Z" level=info msg="cleaning up dead shim" Feb 9 20:33:32.433552 env[1563]: time="2024-02-09T20:33:32.433521432Z" level=warning msg="cleanup warnings time=\"2024-02-09T20:33:32Z\" level=info msg=\"starting signal loop\" namespace=k8s.io pid=3602 runtime=io.containerd.runc.v2\n" Feb 9 20:33:32.433796 env[1563]: time="2024-02-09T20:33:32.433768241Z" level=info msg="TearDown network for sandbox \"400c7caed17c29ec6c9d8480d9388563ccbff391b0df4fd0ef792e1c4ebac91f\" successfully" Feb 9 20:33:32.433796 env[1563]: time="2024-02-09T20:33:32.433789272Z" level=info msg="StopPodSandbox for \"400c7caed17c29ec6c9d8480d9388563ccbff391b0df4fd0ef792e1c4ebac91f\" returns successfully" Feb 9 20:33:32.484233 kubelet[2767]: E0209 20:33:32.484124 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:32.484233 kubelet[2767]: W0209 20:33:32.484171 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:32.484233 kubelet[2767]: E0209 20:33:32.484217 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:32.484818 kubelet[2767]: I0209 20:33:32.484307 2767 reconciler_common.go:169] "operationExecutor.UnmountVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/a029d566-7677-4b14-8b19-30eeaad7a78d-typha-certs\") pod \"a029d566-7677-4b14-8b19-30eeaad7a78d\" (UID: \"a029d566-7677-4b14-8b19-30eeaad7a78d\") " Feb 9 20:33:32.484967 kubelet[2767]: E0209 20:33:32.484862 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:32.484967 kubelet[2767]: W0209 20:33:32.484895 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:32.484967 kubelet[2767]: E0209 20:33:32.484942 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:32.485282 kubelet[2767]: I0209 20:33:32.485011 2767 reconciler_common.go:169] "operationExecutor.UnmountVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a029d566-7677-4b14-8b19-30eeaad7a78d-tigera-ca-bundle\") pod \"a029d566-7677-4b14-8b19-30eeaad7a78d\" (UID: \"a029d566-7677-4b14-8b19-30eeaad7a78d\") " Feb 9 20:33:32.485663 kubelet[2767]: E0209 20:33:32.485580 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:32.485663 kubelet[2767]: W0209 20:33:32.485636 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:32.486009 kubelet[2767]: E0209 20:33:32.485703 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:32.486354 kubelet[2767]: E0209 20:33:32.486300 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:32.486558 kubelet[2767]: W0209 20:33:32.486337 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:32.486558 kubelet[2767]: E0209 20:33:32.486407 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:32.486558 kubelet[2767]: I0209 20:33:32.486484 2767 reconciler_common.go:169] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtj9t\" (UniqueName: \"kubernetes.io/projected/a029d566-7677-4b14-8b19-30eeaad7a78d-kube-api-access-wtj9t\") pod \"a029d566-7677-4b14-8b19-30eeaad7a78d\" (UID: \"a029d566-7677-4b14-8b19-30eeaad7a78d\") " Feb 9 20:33:32.487245 kubelet[2767]: E0209 20:33:32.487199 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:32.487245 kubelet[2767]: W0209 20:33:32.487241 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:32.487717 kubelet[2767]: E0209 20:33:32.487300 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:32.492777 kubelet[2767]: I0209 20:33:32.492658 2767 operation_generator.go:900] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a029d566-7677-4b14-8b19-30eeaad7a78d-typha-certs" (OuterVolumeSpecName: "typha-certs") pod "a029d566-7677-4b14-8b19-30eeaad7a78d" (UID: "a029d566-7677-4b14-8b19-30eeaad7a78d"). InnerVolumeSpecName "typha-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 9 20:33:32.496509 kubelet[2767]: I0209 20:33:32.496425 2767 operation_generator.go:900] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a029d566-7677-4b14-8b19-30eeaad7a78d-kube-api-access-wtj9t" (OuterVolumeSpecName: "kube-api-access-wtj9t") pod "a029d566-7677-4b14-8b19-30eeaad7a78d" (UID: "a029d566-7677-4b14-8b19-30eeaad7a78d"). InnerVolumeSpecName "kube-api-access-wtj9t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 9 20:33:32.498148 kubelet[2767]: E0209 20:33:32.498058 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:32.498148 kubelet[2767]: W0209 20:33:32.498110 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:32.498510 kubelet[2767]: E0209 20:33:32.498179 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:32.498637 kubelet[2767]: W0209 20:33:32.498602 2767 empty_dir.go:525] Warning: Failed to clear quota on /var/lib/kubelet/pods/a029d566-7677-4b14-8b19-30eeaad7a78d/volumes/kubernetes.io~configmap/tigera-ca-bundle: clearQuota called, but quotas disabled Feb 9 20:33:32.498721 systemd[1]: var-lib-kubelet-pods-a029d566\x2d7677\x2d4b14\x2d8b19\x2d30eeaad7a78d-volume\x2dsubpaths-tigera\x2dca\x2dbundle-calico\x2dtypha-1.mount: Deactivated successfully. Feb 9 20:33:32.499292 systemd[1]: var-lib-kubelet-pods-a029d566\x2d7677\x2d4b14\x2d8b19\x2d30eeaad7a78d-volumes-kubernetes.io\x7esecret-typha\x2dcerts.mount: Deactivated successfully. Feb 9 20:33:32.499609 kubelet[2767]: I0209 20:33:32.499374 2767 operation_generator.go:900] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a029d566-7677-4b14-8b19-30eeaad7a78d-tigera-ca-bundle" (OuterVolumeSpecName: "tigera-ca-bundle") pod "a029d566-7677-4b14-8b19-30eeaad7a78d" (UID: "a029d566-7677-4b14-8b19-30eeaad7a78d"). InnerVolumeSpecName "tigera-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 9 20:33:32.506913 systemd[1]: var-lib-kubelet-pods-a029d566\x2d7677\x2d4b14\x2d8b19\x2d30eeaad7a78d-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dwtj9t.mount: Deactivated successfully. Feb 9 20:33:32.587775 kubelet[2767]: I0209 20:33:32.587669 2767 reconciler_common.go:295] "Volume detached for volume \"kube-api-access-wtj9t\" (UniqueName: \"kubernetes.io/projected/a029d566-7677-4b14-8b19-30eeaad7a78d-kube-api-access-wtj9t\") on node \"ci-3510.3.2-a-45f40c263c\" DevicePath \"\"" Feb 9 20:33:32.587775 kubelet[2767]: I0209 20:33:32.587747 2767 reconciler_common.go:295] "Volume detached for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/a029d566-7677-4b14-8b19-30eeaad7a78d-typha-certs\") on node \"ci-3510.3.2-a-45f40c263c\" DevicePath \"\"" Feb 9 20:33:32.587775 kubelet[2767]: I0209 20:33:32.587783 2767 reconciler_common.go:295] "Volume detached for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a029d566-7677-4b14-8b19-30eeaad7a78d-tigera-ca-bundle\") on node \"ci-3510.3.2-a-45f40c263c\" DevicePath \"\"" Feb 9 20:33:32.968629 kubelet[2767]: I0209 20:33:32.968573 2767 scope.go:115] "RemoveContainer" containerID="9eb2a7bfc6c5efbae181c2554b0cd05b997ddd0606b4db5329d8879698e11a87" Feb 9 20:33:32.971908 env[1563]: time="2024-02-09T20:33:32.971704745Z" level=info msg="RemoveContainer for \"9eb2a7bfc6c5efbae181c2554b0cd05b997ddd0606b4db5329d8879698e11a87\"" Feb 9 20:33:32.974014 env[1563]: time="2024-02-09T20:33:32.973981744Z" level=info msg="RemoveContainer for \"9eb2a7bfc6c5efbae181c2554b0cd05b997ddd0606b4db5329d8879698e11a87\" returns successfully" Feb 9 20:33:32.974102 kubelet[2767]: I0209 20:33:32.974093 2767 scope.go:115] "RemoveContainer" containerID="9eb2a7bfc6c5efbae181c2554b0cd05b997ddd0606b4db5329d8879698e11a87" Feb 9 20:33:32.974211 env[1563]: time="2024-02-09T20:33:32.974170415Z" level=error msg="ContainerStatus for 
\"9eb2a7bfc6c5efbae181c2554b0cd05b997ddd0606b4db5329d8879698e11a87\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"9eb2a7bfc6c5efbae181c2554b0cd05b997ddd0606b4db5329d8879698e11a87\": not found" Feb 9 20:33:32.974273 kubelet[2767]: E0209 20:33:32.974268 2767 remote_runtime.go:415] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"9eb2a7bfc6c5efbae181c2554b0cd05b997ddd0606b4db5329d8879698e11a87\": not found" containerID="9eb2a7bfc6c5efbae181c2554b0cd05b997ddd0606b4db5329d8879698e11a87" Feb 9 20:33:32.974298 kubelet[2767]: I0209 20:33:32.974286 2767 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={Type:containerd ID:9eb2a7bfc6c5efbae181c2554b0cd05b997ddd0606b4db5329d8879698e11a87} err="failed to get container status \"9eb2a7bfc6c5efbae181c2554b0cd05b997ddd0606b4db5329d8879698e11a87\": rpc error: code = NotFound desc = an error occurred when try to find container \"9eb2a7bfc6c5efbae181c2554b0cd05b997ddd0606b4db5329d8879698e11a87\": not found" Feb 9 20:33:32.986634 kubelet[2767]: I0209 20:33:32.986584 2767 topology_manager.go:210] "Topology Admit Handler" Feb 9 20:33:32.986634 kubelet[2767]: E0209 20:33:32.986635 2767 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="a029d566-7677-4b14-8b19-30eeaad7a78d" containerName="calico-typha" Feb 9 20:33:32.986781 kubelet[2767]: I0209 20:33:32.986665 2767 memory_manager.go:346] "RemoveStaleState removing state" podUID="a029d566-7677-4b14-8b19-30eeaad7a78d" containerName="calico-typha" Feb 9 20:33:33.006000 audit[3664]: NETFILTER_CFG table=filter:109 family=2 entries=14 op=nft_register_rule pid=3664 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 9 20:33:33.033770 kernel: kauditd_printk_skb: 140 callbacks suppressed Feb 9 20:33:33.033831 kernel: audit: type=1325 audit(1707510813.006:280): table=filter:109 family=2 entries=14 op=nft_register_rule 
pid=3664 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 9 20:33:33.077257 kubelet[2767]: E0209 20:33:33.077220 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:33.077257 kubelet[2767]: W0209 20:33:33.077229 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:33.077257 kubelet[2767]: E0209 20:33:33.077240 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:33.077356 kubelet[2767]: E0209 20:33:33.077351 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:33.077356 kubelet[2767]: W0209 20:33:33.077355 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:33.077396 kubelet[2767]: E0209 20:33:33.077361 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:33.077513 kubelet[2767]: E0209 20:33:33.077473 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:33.077513 kubelet[2767]: W0209 20:33:33.077479 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:33.077513 kubelet[2767]: E0209 20:33:33.077486 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:33.077676 kubelet[2767]: E0209 20:33:33.077638 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:33.077676 kubelet[2767]: W0209 20:33:33.077645 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:33.077676 kubelet[2767]: E0209 20:33:33.077652 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:33.077800 kubelet[2767]: E0209 20:33:33.077757 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:33.077800 kubelet[2767]: W0209 20:33:33.077763 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:33.077800 kubelet[2767]: E0209 20:33:33.077770 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:33.077877 kubelet[2767]: E0209 20:33:33.077870 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:33.077877 kubelet[2767]: W0209 20:33:33.077876 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:33.077916 kubelet[2767]: E0209 20:33:33.077883 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:33.078007 kubelet[2767]: E0209 20:33:33.077973 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:33.078007 kubelet[2767]: W0209 20:33:33.077978 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:33.078007 kubelet[2767]: E0209 20:33:33.077983 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:33.078081 kubelet[2767]: E0209 20:33:33.078040 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:33.078081 kubelet[2767]: W0209 20:33:33.078044 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:33.078081 kubelet[2767]: E0209 20:33:33.078050 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:33.078139 kubelet[2767]: E0209 20:33:33.078105 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:33.078139 kubelet[2767]: W0209 20:33:33.078109 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:33.078139 kubelet[2767]: E0209 20:33:33.078116 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:33.090389 kubelet[2767]: E0209 20:33:33.090352 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:33.090389 kubelet[2767]: W0209 20:33:33.090358 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:33.090389 kubelet[2767]: E0209 20:33:33.090364 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:33.090389 kubelet[2767]: I0209 20:33:33.090379 2767 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/0c6e5140-63eb-49c7-90f3-4cc4ca8b5da5-typha-certs\") pod \"calico-typha-68d78bd549-2hbgc\" (UID: \"0c6e5140-63eb-49c7-90f3-4cc4ca8b5da5\") " pod="calico-system/calico-typha-68d78bd549-2hbgc" Feb 9 20:33:33.090496 kubelet[2767]: E0209 20:33:33.090485 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:33.090496 kubelet[2767]: W0209 20:33:33.090492 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:33.090539 kubelet[2767]: E0209 20:33:33.090501 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:33.090539 kubelet[2767]: I0209 20:33:33.090514 2767 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjhxx\" (UniqueName: \"kubernetes.io/projected/0c6e5140-63eb-49c7-90f3-4cc4ca8b5da5-kube-api-access-pjhxx\") pod \"calico-typha-68d78bd549-2hbgc\" (UID: \"0c6e5140-63eb-49c7-90f3-4cc4ca8b5da5\") " pod="calico-system/calico-typha-68d78bd549-2hbgc" Feb 9 20:33:33.090685 kubelet[2767]: E0209 20:33:33.090650 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:33.090685 kubelet[2767]: W0209 20:33:33.090656 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:33.090685 kubelet[2767]: E0209 20:33:33.090664 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:33.090808 kubelet[2767]: E0209 20:33:33.090768 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:33.090808 kubelet[2767]: W0209 20:33:33.090774 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:33.090808 kubelet[2767]: E0209 20:33:33.090781 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:33.090941 kubelet[2767]: E0209 20:33:33.090896 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:33.090941 kubelet[2767]: W0209 20:33:33.090903 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:33.090941 kubelet[2767]: E0209 20:33:33.090910 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:33.091013 kubelet[2767]: E0209 20:33:33.091002 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:33.091013 kubelet[2767]: W0209 20:33:33.091006 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:33.091013 kubelet[2767]: E0209 20:33:33.091012 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:33.091118 kubelet[2767]: E0209 20:33:33.091085 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:33.091118 kubelet[2767]: W0209 20:33:33.091090 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:33.091118 kubelet[2767]: E0209 20:33:33.091095 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:33.091118 kubelet[2767]: I0209 20:33:33.091106 2767 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0c6e5140-63eb-49c7-90f3-4cc4ca8b5da5-tigera-ca-bundle\") pod \"calico-typha-68d78bd549-2hbgc\" (UID: \"0c6e5140-63eb-49c7-90f3-4cc4ca8b5da5\") " pod="calico-system/calico-typha-68d78bd549-2hbgc" Feb 9 20:33:33.091196 kubelet[2767]: E0209 20:33:33.091175 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:33.091196 kubelet[2767]: W0209 20:33:33.091180 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:33.091196 kubelet[2767]: E0209 20:33:33.091186 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:33.091250 kubelet[2767]: E0209 20:33:33.091242 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:33.091250 kubelet[2767]: W0209 20:33:33.091246 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:33.091287 kubelet[2767]: E0209 20:33:33.091251 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:33.006000 audit[3664]: SYSCALL arch=c000003e syscall=46 success=yes exit=4732 a0=3 a1=7ffd3ee008e0 a2=0 a3=7ffd3ee008cc items=0 ppid=3048 pid=3664 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:33:33.190120 kernel: audit: type=1300 audit(1707510813.006:280): arch=c000003e syscall=46 success=yes exit=4732 a0=3 a1=7ffd3ee008e0 a2=0 a3=7ffd3ee008cc items=0 ppid=3048 pid=3664 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:33:33.190157 kernel: audit: type=1327 audit(1707510813.006:280): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 9 20:33:33.006000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 9 20:33:33.191399 kubelet[2767]: E0209 20:33:33.191387 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 
20:33:33.191435 kubelet[2767]: W0209 20:33:33.191400 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:33.191435 kubelet[2767]: E0209 20:33:33.191417 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:33.191637 kubelet[2767]: E0209 20:33:33.191600 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:33.191637 kubelet[2767]: W0209 20:33:33.191611 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:33.191637 kubelet[2767]: E0209 20:33:33.191629 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:33.191799 kubelet[2767]: E0209 20:33:33.191763 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:33.191799 kubelet[2767]: W0209 20:33:33.191769 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:33.191799 kubelet[2767]: E0209 20:33:33.191779 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:33.191871 kubelet[2767]: E0209 20:33:33.191850 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:33.191871 kubelet[2767]: W0209 20:33:33.191853 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:33.191871 kubelet[2767]: E0209 20:33:33.191859 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:33.191962 kubelet[2767]: E0209 20:33:33.191929 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:33.191962 kubelet[2767]: W0209 20:33:33.191933 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:33.191962 kubelet[2767]: E0209 20:33:33.191940 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:33.192018 kubelet[2767]: E0209 20:33:33.192013 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:33.192018 kubelet[2767]: W0209 20:33:33.192017 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:33.192055 kubelet[2767]: E0209 20:33:33.192023 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:33.192160 kubelet[2767]: E0209 20:33:33.192154 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:33.192181 kubelet[2767]: W0209 20:33:33.192160 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:33.192181 kubelet[2767]: E0209 20:33:33.192169 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:33.192252 kubelet[2767]: E0209 20:33:33.192247 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:33.192274 kubelet[2767]: W0209 20:33:33.192252 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:33.192274 kubelet[2767]: E0209 20:33:33.192259 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:33.192324 kubelet[2767]: E0209 20:33:33.192319 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:33.192351 kubelet[2767]: W0209 20:33:33.192324 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:33.192351 kubelet[2767]: E0209 20:33:33.192330 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:33.192467 kubelet[2767]: E0209 20:33:33.192432 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:33.192467 kubelet[2767]: W0209 20:33:33.192438 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:33.192467 kubelet[2767]: E0209 20:33:33.192448 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:33.192540 kubelet[2767]: E0209 20:33:33.192537 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:33.192559 kubelet[2767]: W0209 20:33:33.192541 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:33.192559 kubelet[2767]: E0209 20:33:33.192548 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:33.192676 kubelet[2767]: E0209 20:33:33.192669 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:33.192676 kubelet[2767]: W0209 20:33:33.192675 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:33.192735 kubelet[2767]: E0209 20:33:33.192684 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:33.192815 kubelet[2767]: E0209 20:33:33.192769 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:33.192815 kubelet[2767]: W0209 20:33:33.192776 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:33.192815 kubelet[2767]: E0209 20:33:33.192785 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:33.192873 kubelet[2767]: E0209 20:33:33.192859 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:33.192873 kubelet[2767]: W0209 20:33:33.192864 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:33.192873 kubelet[2767]: E0209 20:33:33.192873 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:33.192946 kubelet[2767]: E0209 20:33:33.192941 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:33.192966 kubelet[2767]: W0209 20:33:33.192946 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:33.192966 kubelet[2767]: E0209 20:33:33.192955 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:33.193040 kubelet[2767]: E0209 20:33:33.193035 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:33.193062 kubelet[2767]: W0209 20:33:33.193040 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:33.193062 kubelet[2767]: E0209 20:33:33.193049 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:33.194810 kubelet[2767]: E0209 20:33:33.194796 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:33.194849 kubelet[2767]: W0209 20:33:33.194807 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:33.194849 kubelet[2767]: E0209 20:33:33.194824 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:33.198940 kubelet[2767]: E0209 20:33:33.198901 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:33.198940 kubelet[2767]: W0209 20:33:33.198912 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:33.198940 kubelet[2767]: E0209 20:33:33.198926 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:33.007000 audit[3664]: NETFILTER_CFG table=nat:110 family=2 entries=20 op=nft_register_rule pid=3664 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 9 20:33:33.289156 env[1563]: time="2024-02-09T20:33:33.289134884Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-68d78bd549-2hbgc,Uid:0c6e5140-63eb-49c7-90f3-4cc4ca8b5da5,Namespace:calico-system,Attempt:0,}" Feb 9 20:33:33.294848 env[1563]: time="2024-02-09T20:33:33.294815726Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 9 20:33:33.294848 env[1563]: time="2024-02-09T20:33:33.294837032Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 9 20:33:33.294961 env[1563]: time="2024-02-09T20:33:33.294848623Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 9 20:33:33.294961 env[1563]: time="2024-02-09T20:33:33.294930741Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/29f91f6729fa4d67e8db2ab2e2f2f2644f4e61a3bbeae78a09dcafc7b05bb2d3 pid=3736 runtime=io.containerd.runc.v2 Feb 9 20:33:33.007000 audit[3664]: SYSCALL arch=c000003e syscall=46 success=yes exit=5340 a0=3 a1=7ffd3ee008e0 a2=0 a3=7ffd3ee008cc items=0 ppid=3048 pid=3664 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:33:33.417423 kernel: audit: type=1325 audit(1707510813.007:281): table=nat:110 family=2 entries=20 op=nft_register_rule pid=3664 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 9 20:33:33.417478 kernel: audit: type=1300 audit(1707510813.007:281): arch=c000003e syscall=46 success=yes exit=5340 a0=3 a1=7ffd3ee008e0 a2=0 a3=7ffd3ee008cc items=0 ppid=3048 pid=3664 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:33:33.417498 kernel: audit: type=1327 audit(1707510813.007:281): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 9 20:33:33.007000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 9 20:33:33.276000 audit[3728]: NETFILTER_CFG table=filter:111 family=2 entries=14 op=nft_register_rule pid=3728 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 9 20:33:33.493585 env[1563]: time="2024-02-09T20:33:33.493556999Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-typha-68d78bd549-2hbgc,Uid:0c6e5140-63eb-49c7-90f3-4cc4ca8b5da5,Namespace:calico-system,Attempt:0,} returns sandbox id \"29f91f6729fa4d67e8db2ab2e2f2f2644f4e61a3bbeae78a09dcafc7b05bb2d3\"" Feb 9 20:33:33.497322 env[1563]: time="2024-02-09T20:33:33.497305663Z" level=info msg="CreateContainer within sandbox \"29f91f6729fa4d67e8db2ab2e2f2f2644f4e61a3bbeae78a09dcafc7b05bb2d3\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Feb 9 20:33:33.276000 audit[3728]: SYSCALL arch=c000003e syscall=46 success=yes exit=4732 a0=3 a1=7ffcd0482d70 a2=0 a3=7ffcd0482d5c items=0 ppid=3048 pid=3728 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:33:33.646659 kernel: audit: type=1325 audit(1707510813.276:282): table=filter:111 family=2 entries=14 op=nft_register_rule pid=3728 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 9 20:33:33.646719 kernel: audit: type=1300 audit(1707510813.276:282): arch=c000003e syscall=46 success=yes exit=4732 a0=3 a1=7ffcd0482d70 a2=0 a3=7ffcd0482d5c items=0 ppid=3048 pid=3728 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:33:33.646735 kernel: audit: type=1327 audit(1707510813.276:282): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 9 20:33:33.276000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 9 20:33:33.648266 env[1563]: time="2024-02-09T20:33:33.648216607Z" level=info msg="CreateContainer within sandbox \"29f91f6729fa4d67e8db2ab2e2f2f2644f4e61a3bbeae78a09dcafc7b05bb2d3\" for 
&ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"773d1e0bf463755ace967307592e5b8e7449fba7d9ccce622ab27ef4abe59554\"" Feb 9 20:33:33.648413 env[1563]: time="2024-02-09T20:33:33.648398271Z" level=info msg="StartContainer for \"773d1e0bf463755ace967307592e5b8e7449fba7d9ccce622ab27ef4abe59554\"" Feb 9 20:33:33.713000 audit[3728]: NETFILTER_CFG table=nat:112 family=2 entries=20 op=nft_register_rule pid=3728 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 9 20:33:33.713000 audit[3728]: SYSCALL arch=c000003e syscall=46 success=yes exit=5340 a0=3 a1=7ffcd0482d70 a2=0 a3=7ffcd0482d5c items=0 ppid=3048 pid=3728 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:33:33.773411 kernel: audit: type=1325 audit(1707510813.713:283): table=nat:112 family=2 entries=20 op=nft_register_rule pid=3728 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 9 20:33:33.713000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 9 20:33:33.802593 env[1563]: time="2024-02-09T20:33:33.802540112Z" level=info msg="StartContainer for \"773d1e0bf463755ace967307592e5b8e7449fba7d9ccce622ab27ef4abe59554\" returns successfully" Feb 9 20:33:33.883980 kubelet[2767]: E0209 20:33:33.883876 2767 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-dx2ql" podUID=21aae8c4-8c7c-48d6-86a1-b78761bdb569 Feb 9 20:33:33.985560 kubelet[2767]: E0209 20:33:33.985365 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:33.985560 kubelet[2767]: W0209 
20:33:33.985422 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:33.985560 kubelet[2767]: E0209 20:33:33.985493 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:33.986675 kubelet[2767]: E0209 20:33:33.986083 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:33.986675 kubelet[2767]: W0209 20:33:33.986123 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:33.986675 kubelet[2767]: E0209 20:33:33.986181 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:33.987066 kubelet[2767]: E0209 20:33:33.986711 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:33.987066 kubelet[2767]: W0209 20:33:33.986747 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:33.987066 kubelet[2767]: E0209 20:33:33.986803 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:33.987541 kubelet[2767]: E0209 20:33:33.987461 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:33.987541 kubelet[2767]: W0209 20:33:33.987500 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:33.987885 kubelet[2767]: E0209 20:33:33.987555 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:33.988100 kubelet[2767]: E0209 20:33:33.988040 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:33.988100 kubelet[2767]: W0209 20:33:33.988078 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:33.988336 kubelet[2767]: E0209 20:33:33.988134 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:33.988728 kubelet[2767]: E0209 20:33:33.988638 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:33.988728 kubelet[2767]: W0209 20:33:33.988676 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:33.988728 kubelet[2767]: E0209 20:33:33.988731 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:33.989510 kubelet[2767]: E0209 20:33:33.989434 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:33.989510 kubelet[2767]: W0209 20:33:33.989472 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:33.989848 kubelet[2767]: E0209 20:33:33.989527 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:33.990101 kubelet[2767]: E0209 20:33:33.990028 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:33.990101 kubelet[2767]: W0209 20:33:33.990065 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:33.990379 kubelet[2767]: E0209 20:33:33.990118 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:33.990660 kubelet[2767]: E0209 20:33:33.990627 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:33.990775 kubelet[2767]: W0209 20:33:33.990666 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:33.990775 kubelet[2767]: E0209 20:33:33.990725 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:33.991362 kubelet[2767]: E0209 20:33:33.991280 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:33.991362 kubelet[2767]: W0209 20:33:33.991316 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:33.991613 kubelet[2767]: E0209 20:33:33.991384 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:33.991937 kubelet[2767]: E0209 20:33:33.991903 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:33.992094 kubelet[2767]: W0209 20:33:33.991940 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:33.992094 kubelet[2767]: E0209 20:33:33.991994 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:33.992536 kubelet[2767]: E0209 20:33:33.992497 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:33.992688 kubelet[2767]: W0209 20:33:33.992536 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:33.992688 kubelet[2767]: E0209 20:33:33.992592 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:33.997027 kubelet[2767]: E0209 20:33:33.996986 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:33.997027 kubelet[2767]: W0209 20:33:33.997023 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:33.997297 kubelet[2767]: E0209 20:33:33.997067 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:33.997698 kubelet[2767]: E0209 20:33:33.997664 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:33.997830 kubelet[2767]: W0209 20:33:33.997697 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:33.997830 kubelet[2767]: E0209 20:33:33.997744 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:33.998305 kubelet[2767]: E0209 20:33:33.998269 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:33.998499 kubelet[2767]: W0209 20:33:33.998305 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:33.998499 kubelet[2767]: E0209 20:33:33.998386 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:33.999010 kubelet[2767]: E0209 20:33:33.998921 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:33.999010 kubelet[2767]: W0209 20:33:33.998959 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:33.999010 kubelet[2767]: E0209 20:33:33.999009 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:33.999572 kubelet[2767]: E0209 20:33:33.999482 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:33.999572 kubelet[2767]: W0209 20:33:33.999509 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:33.999869 kubelet[2767]: E0209 20:33:33.999623 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:34.000010 kubelet[2767]: E0209 20:33:33.999977 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:34.000145 kubelet[2767]: W0209 20:33:34.000010 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:34.000287 kubelet[2767]: E0209 20:33:34.000149 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:34.000648 kubelet[2767]: E0209 20:33:34.000565 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:34.000648 kubelet[2767]: W0209 20:33:34.000607 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:34.001010 kubelet[2767]: E0209 20:33:34.000734 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:34.001161 kubelet[2767]: E0209 20:33:34.001125 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:34.001278 kubelet[2767]: W0209 20:33:34.001161 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:34.001278 kubelet[2767]: E0209 20:33:34.001217 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:34.001984 kubelet[2767]: E0209 20:33:34.001898 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:34.001984 kubelet[2767]: W0209 20:33:34.001944 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:34.002325 kubelet[2767]: E0209 20:33:34.002015 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:34.003031 kubelet[2767]: E0209 20:33:34.002976 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:34.003031 kubelet[2767]: W0209 20:33:34.003010 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:34.003539 kubelet[2767]: E0209 20:33:34.003090 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:34.003698 kubelet[2767]: E0209 20:33:34.003643 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:34.003698 kubelet[2767]: W0209 20:33:34.003674 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:34.003931 kubelet[2767]: E0209 20:33:34.003758 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:34.003931 kubelet[2767]: I0209 20:33:34.003814 2767 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="calico-system/calico-typha-68d78bd549-2hbgc" podStartSLOduration=14.003727093 pod.CreationTimestamp="2024-02-09 20:33:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-02-09 20:33:34.00283068 +0000 UTC m=+31.311259034" watchObservedRunningTime="2024-02-09 20:33:34.003727093 +0000 UTC m=+31.312155428" Feb 9 20:33:34.004161 kubelet[2767]: E0209 20:33:34.004132 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:34.004313 kubelet[2767]: W0209 20:33:34.004160 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:34.004313 kubelet[2767]: E0209 20:33:34.004267 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:34.004774 kubelet[2767]: E0209 20:33:34.004691 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:34.004774 kubelet[2767]: W0209 20:33:34.004737 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:34.005105 kubelet[2767]: E0209 20:33:34.004823 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:34.005266 kubelet[2767]: E0209 20:33:34.005226 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:34.005454 kubelet[2767]: W0209 20:33:34.005276 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:34.005454 kubelet[2767]: E0209 20:33:34.005367 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:34.005954 kubelet[2767]: E0209 20:33:34.005878 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:34.005954 kubelet[2767]: W0209 20:33:34.005906 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:34.005954 kubelet[2767]: E0209 20:33:34.005948 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:34.006674 kubelet[2767]: E0209 20:33:34.006593 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:34.006674 kubelet[2767]: W0209 20:33:34.006629 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:34.006674 kubelet[2767]: E0209 20:33:34.006676 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:34.007323 kubelet[2767]: E0209 20:33:34.007289 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:34.007488 kubelet[2767]: W0209 20:33:34.007324 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:34.007488 kubelet[2767]: E0209 20:33:34.007396 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:34.007907 kubelet[2767]: E0209 20:33:34.007862 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:34.007907 kubelet[2767]: W0209 20:33:34.007901 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:34.008252 kubelet[2767]: E0209 20:33:34.007956 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:34.122000 audit[3883]: NETFILTER_CFG table=filter:113 family=2 entries=13 op=nft_register_rule pid=3883 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 9 20:33:34.122000 audit[3883]: SYSCALL arch=c000003e syscall=46 success=yes exit=4028 a0=3 a1=7fff09cc11e0 a2=0 a3=7fff09cc11cc items=0 ppid=3048 pid=3883 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:33:34.122000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 9 20:33:34.124000 audit[3883]: NETFILTER_CFG table=nat:114 family=2 entries=27 op=nft_register_chain pid=3883 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 9 20:33:34.124000 audit[3883]: SYSCALL arch=c000003e syscall=46 success=yes exit=8836 a0=3 a1=7fff09cc11e0 a2=0 a3=7fff09cc11cc items=0 ppid=3048 pid=3883 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:33:34.124000 audit: PROCTITLE 
proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 9 20:33:34.319666 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount660448593.mount: Deactivated successfully. Feb 9 20:33:34.888671 kubelet[2767]: I0209 20:33:34.888582 2767 kubelet_volumes.go:160] "Cleaned up orphaned pod volumes dir" podUID=a029d566-7677-4b14-8b19-30eeaad7a78d path="/var/lib/kubelet/pods/a029d566-7677-4b14-8b19-30eeaad7a78d/volumes" Feb 9 20:33:35.000610 kubelet[2767]: E0209 20:33:35.000510 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:35.000610 kubelet[2767]: W0209 20:33:35.000551 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:35.000610 kubelet[2767]: E0209 20:33:35.000595 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:35.001743 kubelet[2767]: E0209 20:33:35.001107 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:35.001743 kubelet[2767]: W0209 20:33:35.001140 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:35.001743 kubelet[2767]: E0209 20:33:35.001180 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:35.001743 kubelet[2767]: E0209 20:33:35.001730 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:35.002133 kubelet[2767]: W0209 20:33:35.001762 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:35.002133 kubelet[2767]: E0209 20:33:35.001802 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:35.002404 kubelet[2767]: E0209 20:33:35.002371 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:35.002404 kubelet[2767]: W0209 20:33:35.002398 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:35.002630 kubelet[2767]: E0209 20:33:35.002434 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:35.002998 kubelet[2767]: E0209 20:33:35.002917 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:35.002998 kubelet[2767]: W0209 20:33:35.002949 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:35.002998 kubelet[2767]: E0209 20:33:35.002988 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:35.003538 kubelet[2767]: E0209 20:33:35.003505 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:35.003538 kubelet[2767]: W0209 20:33:35.003532 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:35.003776 kubelet[2767]: E0209 20:33:35.003566 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:35.004214 kubelet[2767]: E0209 20:33:35.004141 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:35.004214 kubelet[2767]: W0209 20:33:35.004175 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:35.004214 kubelet[2767]: E0209 20:33:35.004213 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:35.004804 kubelet[2767]: E0209 20:33:35.004724 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:35.004804 kubelet[2767]: W0209 20:33:35.004757 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:35.004804 kubelet[2767]: E0209 20:33:35.004795 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:35.005277 kubelet[2767]: E0209 20:33:35.005249 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:35.005277 kubelet[2767]: W0209 20:33:35.005274 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:35.005549 kubelet[2767]: E0209 20:33:35.005310 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:35.005901 kubelet[2767]: E0209 20:33:35.005817 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:35.005901 kubelet[2767]: W0209 20:33:35.005850 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:35.005901 kubelet[2767]: E0209 20:33:35.005894 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:35.006400 kubelet[2767]: E0209 20:33:35.006357 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:35.006400 kubelet[2767]: W0209 20:33:35.006383 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:35.006663 kubelet[2767]: E0209 20:33:35.006415 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:35.006930 kubelet[2767]: E0209 20:33:35.006898 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:35.007046 kubelet[2767]: W0209 20:33:35.006932 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:35.007046 kubelet[2767]: E0209 20:33:35.006972 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:35.007747 kubelet[2767]: E0209 20:33:35.007672 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:35.007747 kubelet[2767]: W0209 20:33:35.007707 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:35.007747 kubelet[2767]: E0209 20:33:35.007746 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:35.008354 kubelet[2767]: E0209 20:33:35.008314 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:35.008480 kubelet[2767]: W0209 20:33:35.008362 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:35.008480 kubelet[2767]: E0209 20:33:35.008419 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:35.008963 kubelet[2767]: E0209 20:33:35.008895 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:35.008963 kubelet[2767]: W0209 20:33:35.008920 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:35.008963 kubelet[2767]: E0209 20:33:35.008959 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:35.009522 kubelet[2767]: E0209 20:33:35.009447 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:35.009522 kubelet[2767]: W0209 20:33:35.009483 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:35.009522 kubelet[2767]: E0209 20:33:35.009530 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:35.010090 kubelet[2767]: E0209 20:33:35.010021 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:35.010090 kubelet[2767]: W0209 20:33:35.010056 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:35.010386 kubelet[2767]: E0209 20:33:35.010174 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:35.010691 kubelet[2767]: E0209 20:33:35.010600 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:35.010691 kubelet[2767]: W0209 20:33:35.010634 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:35.010991 kubelet[2767]: E0209 20:33:35.010774 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:35.011204 kubelet[2767]: E0209 20:33:35.011177 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:35.011204 kubelet[2767]: W0209 20:33:35.011203 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:35.011491 kubelet[2767]: E0209 20:33:35.011289 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:35.011788 kubelet[2767]: E0209 20:33:35.011698 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:35.011788 kubelet[2767]: W0209 20:33:35.011731 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:35.012106 kubelet[2767]: E0209 20:33:35.011870 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:35.012251 kubelet[2767]: E0209 20:33:35.012224 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:35.012385 kubelet[2767]: W0209 20:33:35.012249 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:35.012385 kubelet[2767]: E0209 20:33:35.012292 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:35.012853 kubelet[2767]: E0209 20:33:35.012799 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:35.012853 kubelet[2767]: W0209 20:33:35.012834 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:35.013072 kubelet[2767]: E0209 20:33:35.012962 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:35.013378 kubelet[2767]: E0209 20:33:35.013325 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:35.013526 kubelet[2767]: W0209 20:33:35.013376 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:35.013526 kubelet[2767]: E0209 20:33:35.013503 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:35.013927 kubelet[2767]: E0209 20:33:35.013874 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:35.013927 kubelet[2767]: W0209 20:33:35.013909 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:35.014162 kubelet[2767]: E0209 20:33:35.014019 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:35.014552 kubelet[2767]: E0209 20:33:35.014484 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:35.014552 kubelet[2767]: W0209 20:33:35.014516 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:35.014552 kubelet[2767]: E0209 20:33:35.014560 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:35.015178 kubelet[2767]: E0209 20:33:35.015144 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:35.015311 kubelet[2767]: W0209 20:33:35.015179 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:35.015311 kubelet[2767]: E0209 20:33:35.015226 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:35.015908 kubelet[2767]: E0209 20:33:35.015834 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:35.015908 kubelet[2767]: W0209 20:33:35.015868 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:35.015908 kubelet[2767]: E0209 20:33:35.015914 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:35.016744 kubelet[2767]: E0209 20:33:35.016670 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:35.016744 kubelet[2767]: W0209 20:33:35.016706 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:35.016744 kubelet[2767]: E0209 20:33:35.016755 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:35.017290 kubelet[2767]: E0209 20:33:35.017260 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:35.017290 kubelet[2767]: W0209 20:33:35.017287 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:35.017642 kubelet[2767]: E0209 20:33:35.017329 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:35.017970 kubelet[2767]: E0209 20:33:35.017897 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:35.017970 kubelet[2767]: W0209 20:33:35.017941 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:35.018190 kubelet[2767]: E0209 20:33:35.017986 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:35.884384 kubelet[2767]: E0209 20:33:35.884286 2767 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-dx2ql" podUID=21aae8c4-8c7c-48d6-86a1-b78761bdb569 Feb 9 20:33:36.015326 kubelet[2767]: E0209 20:33:36.015257 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:36.015326 kubelet[2767]: W0209 20:33:36.015311 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:36.016733 kubelet[2767]: E0209 20:33:36.015406 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:36.016733 kubelet[2767]: E0209 20:33:36.015970 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:36.016733 kubelet[2767]: W0209 20:33:36.016006 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:36.016733 kubelet[2767]: E0209 20:33:36.016063 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:36.016733 kubelet[2767]: E0209 20:33:36.016627 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:36.016733 kubelet[2767]: W0209 20:33:36.016664 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:36.016733 kubelet[2767]: E0209 20:33:36.016716 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:36.017818 kubelet[2767]: E0209 20:33:36.017370 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:36.017818 kubelet[2767]: W0209 20:33:36.017407 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:36.017818 kubelet[2767]: E0209 20:33:36.017462 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:36.018292 kubelet[2767]: E0209 20:33:36.017988 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:36.018292 kubelet[2767]: W0209 20:33:36.018022 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:36.018292 kubelet[2767]: E0209 20:33:36.018077 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:36.018817 kubelet[2767]: E0209 20:33:36.018608 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:36.018817 kubelet[2767]: W0209 20:33:36.018641 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:36.018817 kubelet[2767]: E0209 20:33:36.018691 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:36.019365 kubelet[2767]: E0209 20:33:36.019307 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:36.019365 kubelet[2767]: W0209 20:33:36.019354 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:36.019722 kubelet[2767]: E0209 20:33:36.019409 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:36.019943 kubelet[2767]: E0209 20:33:36.019907 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:36.019943 kubelet[2767]: W0209 20:33:36.019937 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:36.020274 kubelet[2767]: E0209 20:33:36.019985 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:36.020514 kubelet[2767]: E0209 20:33:36.020476 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:36.020514 kubelet[2767]: W0209 20:33:36.020506 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:36.020878 kubelet[2767]: E0209 20:33:36.020553 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:36.021103 kubelet[2767]: E0209 20:33:36.021066 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:36.021103 kubelet[2767]: W0209 20:33:36.021098 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:36.021464 kubelet[2767]: E0209 20:33:36.021148 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:36.021704 kubelet[2767]: E0209 20:33:36.021666 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:36.021704 kubelet[2767]: W0209 20:33:36.021697 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:36.022059 kubelet[2767]: E0209 20:33:36.021747 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:36.022281 kubelet[2767]: E0209 20:33:36.022247 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:36.022492 kubelet[2767]: W0209 20:33:36.022278 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:36.022492 kubelet[2767]: E0209 20:33:36.022329 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:36.023091 kubelet[2767]: E0209 20:33:36.023057 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:36.023091 kubelet[2767]: W0209 20:33:36.023089 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:36.023515 kubelet[2767]: E0209 20:33:36.023141 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:36.023784 kubelet[2767]: E0209 20:33:36.023749 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:36.023784 kubelet[2767]: W0209 20:33:36.023781 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:36.024113 kubelet[2767]: E0209 20:33:36.023841 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:36.024489 kubelet[2767]: E0209 20:33:36.024429 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:36.024489 kubelet[2767]: W0209 20:33:36.024464 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:36.024792 kubelet[2767]: E0209 20:33:36.024517 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:36.025070 kubelet[2767]: E0209 20:33:36.025010 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:36.025070 kubelet[2767]: W0209 20:33:36.025045 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:36.025316 kubelet[2767]: E0209 20:33:36.025094 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:36.025675 kubelet[2767]: E0209 20:33:36.025641 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:36.025834 kubelet[2767]: W0209 20:33:36.025677 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:36.025957 kubelet[2767]: E0209 20:33:36.025814 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:36.026189 kubelet[2767]: E0209 20:33:36.026161 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:36.026317 kubelet[2767]: W0209 20:33:36.026189 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:36.026317 kubelet[2767]: E0209 20:33:36.026260 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:36.026800 kubelet[2767]: E0209 20:33:36.026710 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:36.026800 kubelet[2767]: W0209 20:33:36.026745 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:36.027142 kubelet[2767]: E0209 20:33:36.026908 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:36.027258 kubelet[2767]: E0209 20:33:36.027227 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:36.027258 kubelet[2767]: W0209 20:33:36.027253 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:36.027510 kubelet[2767]: E0209 20:33:36.027295 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:36.027933 kubelet[2767]: E0209 20:33:36.027898 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:36.028128 kubelet[2767]: W0209 20:33:36.027934 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:36.028128 kubelet[2767]: E0209 20:33:36.027983 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:36.028481 kubelet[2767]: E0209 20:33:36.028455 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:36.028622 kubelet[2767]: W0209 20:33:36.028481 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:36.028760 kubelet[2767]: E0209 20:33:36.028610 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:36.029018 kubelet[2767]: E0209 20:33:36.028985 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:36.029157 kubelet[2767]: W0209 20:33:36.029020 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:36.029283 kubelet[2767]: E0209 20:33:36.029154 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:36.029516 kubelet[2767]: E0209 20:33:36.029477 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:36.029516 kubelet[2767]: W0209 20:33:36.029502 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:36.029842 kubelet[2767]: E0209 20:33:36.029544 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:36.030132 kubelet[2767]: E0209 20:33:36.030100 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:36.030296 kubelet[2767]: W0209 20:33:36.030134 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:36.030296 kubelet[2767]: E0209 20:33:36.030181 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:36.030835 kubelet[2767]: E0209 20:33:36.030777 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:36.030835 kubelet[2767]: W0209 20:33:36.030811 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:36.031087 kubelet[2767]: E0209 20:33:36.030938 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:36.031312 kubelet[2767]: E0209 20:33:36.031287 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:36.031519 kubelet[2767]: W0209 20:33:36.031313 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:36.031519 kubelet[2767]: E0209 20:33:36.031386 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:36.031911 kubelet[2767]: E0209 20:33:36.031864 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:36.031911 kubelet[2767]: W0209 20:33:36.031901 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:36.032163 kubelet[2767]: E0209 20:33:36.031948 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:36.032465 kubelet[2767]: E0209 20:33:36.032427 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:36.032465 kubelet[2767]: W0209 20:33:36.032452 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:36.032795 kubelet[2767]: E0209 20:33:36.032486 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 9 20:33:36.033302 kubelet[2767]: E0209 20:33:36.033267 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 9 20:33:36.033447 kubelet[2767]: W0209 20:33:36.033304 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 9 20:33:36.033447 kubelet[2767]: E0209 20:33:36.033365 2767 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 9 20:33:37.883971 kubelet[2767]: E0209 20:33:37.883900 2767 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-dx2ql" podUID=21aae8c4-8c7c-48d6-86a1-b78761bdb569 Feb 9 20:33:39.254564 env[1563]: time="2024-02-09T20:33:39.254512427Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.27.0,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 9 20:33:39.255148 env[1563]: time="2024-02-09T20:33:39.255107923Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:6506d2e0be2d5ec9cb8dbe00c4b4f037c67b6ab4ec14a1f0c83333ac51f4da9a,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 9 20:33:39.256518 env[1563]: time="2024-02-09T20:33:39.256475960Z" level=info msg="ImageUpdate event &ImageUpdate{Name:ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.27.0,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 9 20:33:39.257461 env[1563]: time="2024-02-09T20:33:39.257416939Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:b05edbd1f80db4ada229e6001a666a7dd36bb6ab617143684fb3d28abfc4b71e,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 9 20:33:39.258444 env[1563]: time="2024-02-09T20:33:39.258385208Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.27.0\" returns image reference \"sha256:6506d2e0be2d5ec9cb8dbe00c4b4f037c67b6ab4ec14a1f0c83333ac51f4da9a\"" Feb 9 20:33:39.259437 env[1563]: time="2024-02-09T20:33:39.259422901Z" level=info msg="CreateContainer within sandbox \"065f0ab880c22f2d020e2766cde9b3415d10e5606e1bc08e75e42d08a0bf0215\" for container 
&ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Feb 9 20:33:39.264052 env[1563]: time="2024-02-09T20:33:39.264007820Z" level=info msg="CreateContainer within sandbox \"065f0ab880c22f2d020e2766cde9b3415d10e5606e1bc08e75e42d08a0bf0215\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"f79bc26b8e2b9df395503d2dede040a7b0d47788366406bc2766b29d00b4d741\"" Feb 9 20:33:39.264315 env[1563]: time="2024-02-09T20:33:39.264279531Z" level=info msg="StartContainer for \"f79bc26b8e2b9df395503d2dede040a7b0d47788366406bc2766b29d00b4d741\"" Feb 9 20:33:39.320892 env[1563]: time="2024-02-09T20:33:39.320856629Z" level=info msg="StartContainer for \"f79bc26b8e2b9df395503d2dede040a7b0d47788366406bc2766b29d00b4d741\" returns successfully" Feb 9 20:33:39.363147 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-f79bc26b8e2b9df395503d2dede040a7b0d47788366406bc2766b29d00b4d741-rootfs.mount: Deactivated successfully. Feb 9 20:33:39.410285 env[1563]: time="2024-02-09T20:33:39.410154967Z" level=info msg="shim disconnected" id=f79bc26b8e2b9df395503d2dede040a7b0d47788366406bc2766b29d00b4d741 Feb 9 20:33:39.410285 env[1563]: time="2024-02-09T20:33:39.410254406Z" level=warning msg="cleaning up after shim disconnected" id=f79bc26b8e2b9df395503d2dede040a7b0d47788366406bc2766b29d00b4d741 namespace=k8s.io Feb 9 20:33:39.410285 env[1563]: time="2024-02-09T20:33:39.410283822Z" level=info msg="cleaning up dead shim" Feb 9 20:33:39.438872 env[1563]: time="2024-02-09T20:33:39.438755739Z" level=warning msg="cleanup warnings time=\"2024-02-09T20:33:39Z\" level=info msg=\"starting signal loop\" namespace=k8s.io pid=3988 runtime=io.containerd.runc.v2\n" Feb 9 20:33:39.883821 kubelet[2767]: E0209 20:33:39.883746 2767 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" 
pod="calico-system/csi-node-driver-dx2ql" podUID=21aae8c4-8c7c-48d6-86a1-b78761bdb569 Feb 9 20:33:39.997550 env[1563]: time="2024-02-09T20:33:39.997461465Z" level=info msg="StopPodSandbox for \"065f0ab880c22f2d020e2766cde9b3415d10e5606e1bc08e75e42d08a0bf0215\"" Feb 9 20:33:39.997810 env[1563]: time="2024-02-09T20:33:39.997614022Z" level=info msg="Container to stop \"f79bc26b8e2b9df395503d2dede040a7b0d47788366406bc2766b29d00b4d741\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Feb 9 20:33:40.027210 env[1563]: time="2024-02-09T20:33:40.027156329Z" level=info msg="shim disconnected" id=065f0ab880c22f2d020e2766cde9b3415d10e5606e1bc08e75e42d08a0bf0215 Feb 9 20:33:40.027210 env[1563]: time="2024-02-09T20:33:40.027211756Z" level=warning msg="cleaning up after shim disconnected" id=065f0ab880c22f2d020e2766cde9b3415d10e5606e1bc08e75e42d08a0bf0215 namespace=k8s.io Feb 9 20:33:40.027390 env[1563]: time="2024-02-09T20:33:40.027220273Z" level=info msg="cleaning up dead shim" Feb 9 20:33:40.031628 env[1563]: time="2024-02-09T20:33:40.031609289Z" level=warning msg="cleanup warnings time=\"2024-02-09T20:33:40Z\" level=info msg=\"starting signal loop\" namespace=k8s.io pid=4025 runtime=io.containerd.runc.v2\n" Feb 9 20:33:40.031801 env[1563]: time="2024-02-09T20:33:40.031785936Z" level=info msg="TearDown network for sandbox \"065f0ab880c22f2d020e2766cde9b3415d10e5606e1bc08e75e42d08a0bf0215\" successfully" Feb 9 20:33:40.031835 env[1563]: time="2024-02-09T20:33:40.031800710Z" level=info msg="StopPodSandbox for \"065f0ab880c22f2d020e2766cde9b3415d10e5606e1bc08e75e42d08a0bf0215\" returns successfully" Feb 9 20:33:40.158335 kubelet[2767]: I0209 20:33:40.158131 2767 reconciler_common.go:169] "operationExecutor.UnmountVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/a03330e2-7655-4766-b4a3-8964354a083e-xtables-lock\") pod \"a03330e2-7655-4766-b4a3-8964354a083e\" (UID: \"a03330e2-7655-4766-b4a3-8964354a083e\") " Feb 9 
20:33:40.158335 kubelet[2767]: I0209 20:33:40.158230 2767 reconciler_common.go:169] "operationExecutor.UnmountVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/a03330e2-7655-4766-b4a3-8964354a083e-var-lib-calico\") pod \"a03330e2-7655-4766-b4a3-8964354a083e\" (UID: \"a03330e2-7655-4766-b4a3-8964354a083e\") " Feb 9 20:33:40.158335 kubelet[2767]: I0209 20:33:40.158287 2767 reconciler_common.go:169] "operationExecutor.UnmountVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/a03330e2-7655-4766-b4a3-8964354a083e-cni-bin-dir\") pod \"a03330e2-7655-4766-b4a3-8964354a083e\" (UID: \"a03330e2-7655-4766-b4a3-8964354a083e\") " Feb 9 20:33:40.158335 kubelet[2767]: I0209 20:33:40.158291 2767 operation_generator.go:900] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a03330e2-7655-4766-b4a3-8964354a083e-xtables-lock" (OuterVolumeSpecName: "xtables-lock") pod "a03330e2-7655-4766-b4a3-8964354a083e" (UID: "a03330e2-7655-4766-b4a3-8964354a083e"). InnerVolumeSpecName "xtables-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 9 20:33:40.158335 kubelet[2767]: I0209 20:33:40.158365 2767 reconciler_common.go:169] "operationExecutor.UnmountVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/a03330e2-7655-4766-b4a3-8964354a083e-policysync\") pod \"a03330e2-7655-4766-b4a3-8964354a083e\" (UID: \"a03330e2-7655-4766-b4a3-8964354a083e\") " Feb 9 20:33:40.159114 kubelet[2767]: I0209 20:33:40.158385 2767 operation_generator.go:900] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a03330e2-7655-4766-b4a3-8964354a083e-var-lib-calico" (OuterVolumeSpecName: "var-lib-calico") pod "a03330e2-7655-4766-b4a3-8964354a083e" (UID: "a03330e2-7655-4766-b4a3-8964354a083e"). InnerVolumeSpecName "var-lib-calico". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 9 20:33:40.159114 kubelet[2767]: I0209 20:33:40.158464 2767 reconciler_common.go:169] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a03330e2-7655-4766-b4a3-8964354a083e-lib-modules\") pod \"a03330e2-7655-4766-b4a3-8964354a083e\" (UID: \"a03330e2-7655-4766-b4a3-8964354a083e\") " Feb 9 20:33:40.159114 kubelet[2767]: I0209 20:33:40.158516 2767 operation_generator.go:900] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a03330e2-7655-4766-b4a3-8964354a083e-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "a03330e2-7655-4766-b4a3-8964354a083e" (UID: "a03330e2-7655-4766-b4a3-8964354a083e"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 9 20:33:40.159114 kubelet[2767]: I0209 20:33:40.158460 2767 operation_generator.go:900] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a03330e2-7655-4766-b4a3-8964354a083e-cni-bin-dir" (OuterVolumeSpecName: "cni-bin-dir") pod "a03330e2-7655-4766-b4a3-8964354a083e" (UID: "a03330e2-7655-4766-b4a3-8964354a083e"). InnerVolumeSpecName "cni-bin-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 9 20:33:40.159114 kubelet[2767]: I0209 20:33:40.158546 2767 reconciler_common.go:169] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94drh\" (UniqueName: \"kubernetes.io/projected/a03330e2-7655-4766-b4a3-8964354a083e-kube-api-access-94drh\") pod \"a03330e2-7655-4766-b4a3-8964354a083e\" (UID: \"a03330e2-7655-4766-b4a3-8964354a083e\") " Feb 9 20:33:40.159702 kubelet[2767]: I0209 20:33:40.158501 2767 operation_generator.go:900] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a03330e2-7655-4766-b4a3-8964354a083e-policysync" (OuterVolumeSpecName: "policysync") pod "a03330e2-7655-4766-b4a3-8964354a083e" (UID: "a03330e2-7655-4766-b4a3-8964354a083e"). InnerVolumeSpecName "policysync". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 9 20:33:40.159702 kubelet[2767]: I0209 20:33:40.158706 2767 reconciler_common.go:169] "operationExecutor.UnmountVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/a03330e2-7655-4766-b4a3-8964354a083e-cni-net-dir\") pod \"a03330e2-7655-4766-b4a3-8964354a083e\" (UID: \"a03330e2-7655-4766-b4a3-8964354a083e\") " Feb 9 20:33:40.159702 kubelet[2767]: I0209 20:33:40.158803 2767 reconciler_common.go:169] "operationExecutor.UnmountVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/a03330e2-7655-4766-b4a3-8964354a083e-cni-log-dir\") pod \"a03330e2-7655-4766-b4a3-8964354a083e\" (UID: \"a03330e2-7655-4766-b4a3-8964354a083e\") " Feb 9 20:33:40.159702 kubelet[2767]: I0209 20:33:40.158792 2767 operation_generator.go:900] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a03330e2-7655-4766-b4a3-8964354a083e-cni-net-dir" (OuterVolumeSpecName: "cni-net-dir") pod "a03330e2-7655-4766-b4a3-8964354a083e" (UID: "a03330e2-7655-4766-b4a3-8964354a083e"). InnerVolumeSpecName "cni-net-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 9 20:33:40.159702 kubelet[2767]: I0209 20:33:40.158875 2767 operation_generator.go:900] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a03330e2-7655-4766-b4a3-8964354a083e-cni-log-dir" (OuterVolumeSpecName: "cni-log-dir") pod "a03330e2-7655-4766-b4a3-8964354a083e" (UID: "a03330e2-7655-4766-b4a3-8964354a083e"). InnerVolumeSpecName "cni-log-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 9 20:33:40.160233 kubelet[2767]: I0209 20:33:40.158897 2767 reconciler_common.go:169] "operationExecutor.UnmountVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/a03330e2-7655-4766-b4a3-8964354a083e-flexvol-driver-host\") pod \"a03330e2-7655-4766-b4a3-8964354a083e\" (UID: \"a03330e2-7655-4766-b4a3-8964354a083e\") " Feb 9 20:33:40.160233 kubelet[2767]: I0209 20:33:40.158944 2767 operation_generator.go:900] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a03330e2-7655-4766-b4a3-8964354a083e-flexvol-driver-host" (OuterVolumeSpecName: "flexvol-driver-host") pod "a03330e2-7655-4766-b4a3-8964354a083e" (UID: "a03330e2-7655-4766-b4a3-8964354a083e"). InnerVolumeSpecName "flexvol-driver-host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 9 20:33:40.160233 kubelet[2767]: I0209 20:33:40.159040 2767 reconciler_common.go:169] "operationExecutor.UnmountVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/a03330e2-7655-4766-b4a3-8964354a083e-var-run-calico\") pod \"a03330e2-7655-4766-b4a3-8964354a083e\" (UID: \"a03330e2-7655-4766-b4a3-8964354a083e\") " Feb 9 20:33:40.160233 kubelet[2767]: I0209 20:33:40.159126 2767 reconciler_common.go:169] "operationExecutor.UnmountVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a03330e2-7655-4766-b4a3-8964354a083e-tigera-ca-bundle\") pod \"a03330e2-7655-4766-b4a3-8964354a083e\" (UID: \"a03330e2-7655-4766-b4a3-8964354a083e\") " Feb 9 20:33:40.160233 kubelet[2767]: I0209 20:33:40.159122 2767 operation_generator.go:900] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a03330e2-7655-4766-b4a3-8964354a083e-var-run-calico" (OuterVolumeSpecName: "var-run-calico") pod "a03330e2-7655-4766-b4a3-8964354a083e" (UID: "a03330e2-7655-4766-b4a3-8964354a083e"). InnerVolumeSpecName "var-run-calico". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 9 20:33:40.160800 kubelet[2767]: I0209 20:33:40.159195 2767 reconciler_common.go:169] "operationExecutor.UnmountVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/a03330e2-7655-4766-b4a3-8964354a083e-node-certs\") pod \"a03330e2-7655-4766-b4a3-8964354a083e\" (UID: \"a03330e2-7655-4766-b4a3-8964354a083e\") " Feb 9 20:33:40.160800 kubelet[2767]: I0209 20:33:40.159316 2767 reconciler_common.go:295] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a03330e2-7655-4766-b4a3-8964354a083e-lib-modules\") on node \"ci-3510.3.2-a-45f40c263c\" DevicePath \"\"" Feb 9 20:33:40.160800 kubelet[2767]: I0209 20:33:40.159372 2767 reconciler_common.go:295] "Volume detached for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/a03330e2-7655-4766-b4a3-8964354a083e-policysync\") on node \"ci-3510.3.2-a-45f40c263c\" DevicePath \"\"" Feb 9 20:33:40.160800 kubelet[2767]: I0209 20:33:40.159404 2767 reconciler_common.go:295] "Volume detached for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/a03330e2-7655-4766-b4a3-8964354a083e-cni-net-dir\") on node \"ci-3510.3.2-a-45f40c263c\" DevicePath \"\"" Feb 9 20:33:40.160800 kubelet[2767]: I0209 20:33:40.159433 2767 reconciler_common.go:295] "Volume detached for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/a03330e2-7655-4766-b4a3-8964354a083e-cni-log-dir\") on node \"ci-3510.3.2-a-45f40c263c\" DevicePath \"\"" Feb 9 20:33:40.160800 kubelet[2767]: I0209 20:33:40.159467 2767 reconciler_common.go:295] "Volume detached for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/a03330e2-7655-4766-b4a3-8964354a083e-flexvol-driver-host\") on node \"ci-3510.3.2-a-45f40c263c\" DevicePath \"\"" Feb 9 20:33:40.160800 kubelet[2767]: I0209 20:33:40.159501 2767 reconciler_common.go:295] "Volume detached for volume \"var-run-calico\" (UniqueName: 
\"kubernetes.io/host-path/a03330e2-7655-4766-b4a3-8964354a083e-var-run-calico\") on node \"ci-3510.3.2-a-45f40c263c\" DevicePath \"\"" Feb 9 20:33:40.160800 kubelet[2767]: I0209 20:33:40.159533 2767 reconciler_common.go:295] "Volume detached for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/a03330e2-7655-4766-b4a3-8964354a083e-xtables-lock\") on node \"ci-3510.3.2-a-45f40c263c\" DevicePath \"\"" Feb 9 20:33:40.161657 kubelet[2767]: I0209 20:33:40.159563 2767 reconciler_common.go:295] "Volume detached for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/a03330e2-7655-4766-b4a3-8964354a083e-var-lib-calico\") on node \"ci-3510.3.2-a-45f40c263c\" DevicePath \"\"" Feb 9 20:33:40.161657 kubelet[2767]: W0209 20:33:40.159539 2767 empty_dir.go:525] Warning: Failed to clear quota on /var/lib/kubelet/pods/a03330e2-7655-4766-b4a3-8964354a083e/volumes/kubernetes.io~configmap/tigera-ca-bundle: clearQuota called, but quotas disabled Feb 9 20:33:40.161657 kubelet[2767]: I0209 20:33:40.159592 2767 reconciler_common.go:295] "Volume detached for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/a03330e2-7655-4766-b4a3-8964354a083e-cni-bin-dir\") on node \"ci-3510.3.2-a-45f40c263c\" DevicePath \"\"" Feb 9 20:33:40.161657 kubelet[2767]: I0209 20:33:40.160099 2767 operation_generator.go:900] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a03330e2-7655-4766-b4a3-8964354a083e-tigera-ca-bundle" (OuterVolumeSpecName: "tigera-ca-bundle") pod "a03330e2-7655-4766-b4a3-8964354a083e" (UID: "a03330e2-7655-4766-b4a3-8964354a083e"). InnerVolumeSpecName "tigera-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 9 20:33:40.164846 kubelet[2767]: I0209 20:33:40.164808 2767 operation_generator.go:900] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a03330e2-7655-4766-b4a3-8964354a083e-kube-api-access-94drh" (OuterVolumeSpecName: "kube-api-access-94drh") pod "a03330e2-7655-4766-b4a3-8964354a083e" (UID: "a03330e2-7655-4766-b4a3-8964354a083e"). InnerVolumeSpecName "kube-api-access-94drh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 9 20:33:40.164939 kubelet[2767]: I0209 20:33:40.164895 2767 operation_generator.go:900] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a03330e2-7655-4766-b4a3-8964354a083e-node-certs" (OuterVolumeSpecName: "node-certs") pod "a03330e2-7655-4766-b4a3-8964354a083e" (UID: "a03330e2-7655-4766-b4a3-8964354a083e"). InnerVolumeSpecName "node-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 9 20:33:40.260725 kubelet[2767]: I0209 20:33:40.260560 2767 reconciler_common.go:295] "Volume detached for volume \"kube-api-access-94drh\" (UniqueName: \"kubernetes.io/projected/a03330e2-7655-4766-b4a3-8964354a083e-kube-api-access-94drh\") on node \"ci-3510.3.2-a-45f40c263c\" DevicePath \"\"" Feb 9 20:33:40.260725 kubelet[2767]: I0209 20:33:40.260699 2767 reconciler_common.go:295] "Volume detached for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a03330e2-7655-4766-b4a3-8964354a083e-tigera-ca-bundle\") on node \"ci-3510.3.2-a-45f40c263c\" DevicePath \"\"" Feb 9 20:33:40.260725 kubelet[2767]: I0209 20:33:40.260737 2767 reconciler_common.go:295] "Volume detached for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/a03330e2-7655-4766-b4a3-8964354a083e-node-certs\") on node \"ci-3510.3.2-a-45f40c263c\" DevicePath \"\"" Feb 9 20:33:40.267942 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-065f0ab880c22f2d020e2766cde9b3415d10e5606e1bc08e75e42d08a0bf0215-rootfs.mount: Deactivated successfully. 
Feb 9 20:33:40.268039 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-065f0ab880c22f2d020e2766cde9b3415d10e5606e1bc08e75e42d08a0bf0215-shm.mount: Deactivated successfully. Feb 9 20:33:40.268116 systemd[1]: var-lib-kubelet-pods-a03330e2\x2d7655\x2d4766\x2db4a3\x2d8964354a083e-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d94drh.mount: Deactivated successfully. Feb 9 20:33:40.268191 systemd[1]: var-lib-kubelet-pods-a03330e2\x2d7655\x2d4766\x2db4a3\x2d8964354a083e-volumes-kubernetes.io\x7esecret-node\x2dcerts.mount: Deactivated successfully. Feb 9 20:33:41.003408 kubelet[2767]: I0209 20:33:41.003332 2767 scope.go:115] "RemoveContainer" containerID="f79bc26b8e2b9df395503d2dede040a7b0d47788366406bc2766b29d00b4d741" Feb 9 20:33:41.006032 env[1563]: time="2024-02-09T20:33:41.005959717Z" level=info msg="RemoveContainer for \"f79bc26b8e2b9df395503d2dede040a7b0d47788366406bc2766b29d00b4d741\"" Feb 9 20:33:41.022393 env[1563]: time="2024-02-09T20:33:41.022263069Z" level=info msg="RemoveContainer for \"f79bc26b8e2b9df395503d2dede040a7b0d47788366406bc2766b29d00b4d741\" returns successfully" Feb 9 20:33:41.029904 kubelet[2767]: I0209 20:33:41.029881 2767 topology_manager.go:210] "Topology Admit Handler" Feb 9 20:33:41.030014 kubelet[2767]: E0209 20:33:41.029923 2767 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="a03330e2-7655-4766-b4a3-8964354a083e" containerName="flexvol-driver" Feb 9 20:33:41.030014 kubelet[2767]: I0209 20:33:41.029942 2767 memory_manager.go:346] "RemoveStaleState removing state" podUID="a03330e2-7655-4766-b4a3-8964354a083e" containerName="flexvol-driver" Feb 9 20:33:41.067245 kubelet[2767]: I0209 20:33:41.067194 2767 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/ad6450f1-d22f-43d2-9291-94e75a6d1af4-xtables-lock\") pod \"calico-node-9ljtv\" (UID: \"ad6450f1-d22f-43d2-9291-94e75a6d1af4\") " 
pod="calico-system/calico-node-9ljtv" Feb 9 20:33:41.067381 kubelet[2767]: I0209 20:33:41.067275 2767 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad6450f1-d22f-43d2-9291-94e75a6d1af4-tigera-ca-bundle\") pod \"calico-node-9ljtv\" (UID: \"ad6450f1-d22f-43d2-9291-94e75a6d1af4\") " pod="calico-system/calico-node-9ljtv" Feb 9 20:33:41.067381 kubelet[2767]: I0209 20:33:41.067303 2767 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/ad6450f1-d22f-43d2-9291-94e75a6d1af4-var-run-calico\") pod \"calico-node-9ljtv\" (UID: \"ad6450f1-d22f-43d2-9291-94e75a6d1af4\") " pod="calico-system/calico-node-9ljtv" Feb 9 20:33:41.067381 kubelet[2767]: I0209 20:33:41.067322 2767 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5z5g5\" (UniqueName: \"kubernetes.io/projected/ad6450f1-d22f-43d2-9291-94e75a6d1af4-kube-api-access-5z5g5\") pod \"calico-node-9ljtv\" (UID: \"ad6450f1-d22f-43d2-9291-94e75a6d1af4\") " pod="calico-system/calico-node-9ljtv" Feb 9 20:33:41.067483 kubelet[2767]: I0209 20:33:41.067381 2767 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/ad6450f1-d22f-43d2-9291-94e75a6d1af4-node-certs\") pod \"calico-node-9ljtv\" (UID: \"ad6450f1-d22f-43d2-9291-94e75a6d1af4\") " pod="calico-system/calico-node-9ljtv" Feb 9 20:33:41.067483 kubelet[2767]: I0209 20:33:41.067443 2767 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/ad6450f1-d22f-43d2-9291-94e75a6d1af4-cni-bin-dir\") pod \"calico-node-9ljtv\" (UID: \"ad6450f1-d22f-43d2-9291-94e75a6d1af4\") " pod="calico-system/calico-node-9ljtv" Feb 9 20:33:41.067483 
kubelet[2767]: I0209 20:33:41.067482 2767 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/ad6450f1-d22f-43d2-9291-94e75a6d1af4-cni-net-dir\") pod \"calico-node-9ljtv\" (UID: \"ad6450f1-d22f-43d2-9291-94e75a6d1af4\") " pod="calico-system/calico-node-9ljtv" Feb 9 20:33:41.067567 kubelet[2767]: I0209 20:33:41.067513 2767 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ad6450f1-d22f-43d2-9291-94e75a6d1af4-lib-modules\") pod \"calico-node-9ljtv\" (UID: \"ad6450f1-d22f-43d2-9291-94e75a6d1af4\") " pod="calico-system/calico-node-9ljtv" Feb 9 20:33:41.067567 kubelet[2767]: I0209 20:33:41.067543 2767 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/ad6450f1-d22f-43d2-9291-94e75a6d1af4-policysync\") pod \"calico-node-9ljtv\" (UID: \"ad6450f1-d22f-43d2-9291-94e75a6d1af4\") " pod="calico-system/calico-node-9ljtv" Feb 9 20:33:41.067651 kubelet[2767]: I0209 20:33:41.067570 2767 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/ad6450f1-d22f-43d2-9291-94e75a6d1af4-flexvol-driver-host\") pod \"calico-node-9ljtv\" (UID: \"ad6450f1-d22f-43d2-9291-94e75a6d1af4\") " pod="calico-system/calico-node-9ljtv" Feb 9 20:33:41.067651 kubelet[2767]: I0209 20:33:41.067632 2767 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/ad6450f1-d22f-43d2-9291-94e75a6d1af4-var-lib-calico\") pod \"calico-node-9ljtv\" (UID: \"ad6450f1-d22f-43d2-9291-94e75a6d1af4\") " pod="calico-system/calico-node-9ljtv" Feb 9 20:33:41.067651 kubelet[2767]: I0209 20:33:41.067650 2767 reconciler_common.go:253] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/ad6450f1-d22f-43d2-9291-94e75a6d1af4-cni-log-dir\") pod \"calico-node-9ljtv\" (UID: \"ad6450f1-d22f-43d2-9291-94e75a6d1af4\") " pod="calico-system/calico-node-9ljtv" Feb 9 20:33:41.333690 env[1563]: time="2024-02-09T20:33:41.333551231Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-9ljtv,Uid:ad6450f1-d22f-43d2-9291-94e75a6d1af4,Namespace:calico-system,Attempt:0,}" Feb 9 20:33:41.349312 env[1563]: time="2024-02-09T20:33:41.349276776Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 9 20:33:41.349312 env[1563]: time="2024-02-09T20:33:41.349299349Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 9 20:33:41.349312 env[1563]: time="2024-02-09T20:33:41.349306337Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 9 20:33:41.349415 env[1563]: time="2024-02-09T20:33:41.349386692Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/7b272b09d4cd745926406ac3c0faf831dc1f7ebea63cd5efa41d8fae567dcd87 pid=4053 runtime=io.containerd.runc.v2 Feb 9 20:33:41.392036 env[1563]: time="2024-02-09T20:33:41.392003047Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-9ljtv,Uid:ad6450f1-d22f-43d2-9291-94e75a6d1af4,Namespace:calico-system,Attempt:0,} returns sandbox id \"7b272b09d4cd745926406ac3c0faf831dc1f7ebea63cd5efa41d8fae567dcd87\"" Feb 9 20:33:41.393624 env[1563]: time="2024-02-09T20:33:41.393599322Z" level=info msg="CreateContainer within sandbox \"7b272b09d4cd745926406ac3c0faf831dc1f7ebea63cd5efa41d8fae567dcd87\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Feb 9 20:33:41.406714 env[1563]: time="2024-02-09T20:33:41.406654273Z" level=info msg="CreateContainer within sandbox \"7b272b09d4cd745926406ac3c0faf831dc1f7ebea63cd5efa41d8fae567dcd87\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"72bedd678c3d08c85d7d8ab8f749454a3e24b21f567239d91f644c8015fad149\"" Feb 9 20:33:41.407012 env[1563]: time="2024-02-09T20:33:41.406984362Z" level=info msg="StartContainer for \"72bedd678c3d08c85d7d8ab8f749454a3e24b21f567239d91f644c8015fad149\"" Feb 9 20:33:41.513686 env[1563]: time="2024-02-09T20:33:41.513614058Z" level=info msg="StartContainer for \"72bedd678c3d08c85d7d8ab8f749454a3e24b21f567239d91f644c8015fad149\" returns successfully" Feb 9 20:33:41.562718 env[1563]: time="2024-02-09T20:33:41.562631358Z" level=info msg="shim disconnected" id=72bedd678c3d08c85d7d8ab8f749454a3e24b21f567239d91f644c8015fad149 Feb 9 20:33:41.562718 env[1563]: time="2024-02-09T20:33:41.562717079Z" level=warning msg="cleaning up after shim disconnected" id=72bedd678c3d08c85d7d8ab8f749454a3e24b21f567239d91f644c8015fad149 
namespace=k8s.io Feb 9 20:33:41.563123 env[1563]: time="2024-02-09T20:33:41.562739971Z" level=info msg="cleaning up dead shim" Feb 9 20:33:41.589971 env[1563]: time="2024-02-09T20:33:41.589788161Z" level=warning msg="cleanup warnings time=\"2024-02-09T20:33:41Z\" level=info msg=\"starting signal loop\" namespace=k8s.io pid=4131 runtime=io.containerd.runc.v2\n" Feb 9 20:33:41.884541 kubelet[2767]: E0209 20:33:41.884442 2767 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-dx2ql" podUID=21aae8c4-8c7c-48d6-86a1-b78761bdb569 Feb 9 20:33:42.013078 env[1563]: time="2024-02-09T20:33:42.012980759Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.27.0\"" Feb 9 20:33:42.885674 kubelet[2767]: I0209 20:33:42.885656 2767 kubelet_volumes.go:160] "Cleaned up orphaned pod volumes dir" podUID=a03330e2-7655-4766-b4a3-8964354a083e path="/var/lib/kubelet/pods/a03330e2-7655-4766-b4a3-8964354a083e/volumes" Feb 9 20:33:43.883827 kubelet[2767]: E0209 20:33:43.883723 2767 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-dx2ql" podUID=21aae8c4-8c7c-48d6-86a1-b78761bdb569 Feb 9 20:33:45.397819 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1002840716.mount: Deactivated successfully. 
Feb 9 20:33:45.884229 kubelet[2767]: E0209 20:33:45.884122 2767 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-dx2ql" podUID=21aae8c4-8c7c-48d6-86a1-b78761bdb569 Feb 9 20:33:47.884734 kubelet[2767]: E0209 20:33:47.884667 2767 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-dx2ql" podUID=21aae8c4-8c7c-48d6-86a1-b78761bdb569 Feb 9 20:33:49.884219 kubelet[2767]: E0209 20:33:49.884112 2767 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-dx2ql" podUID=21aae8c4-8c7c-48d6-86a1-b78761bdb569 Feb 9 20:33:51.884536 kubelet[2767]: E0209 20:33:51.884485 2767 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-dx2ql" podUID=21aae8c4-8c7c-48d6-86a1-b78761bdb569 Feb 9 20:33:53.884471 kubelet[2767]: E0209 20:33:53.884406 2767 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-dx2ql" podUID=21aae8c4-8c7c-48d6-86a1-b78761bdb569 Feb 9 20:33:55.884316 kubelet[2767]: E0209 20:33:55.884214 2767 pod_workers.go:965] "Error syncing pod, 
skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-dx2ql" podUID=21aae8c4-8c7c-48d6-86a1-b78761bdb569 Feb 9 20:33:57.425773 env[1563]: time="2024-02-09T20:33:57.425748505Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/cni:v3.27.0,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 9 20:33:57.426272 env[1563]: time="2024-02-09T20:33:57.426260886Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:8e8d96a874c0e2f137bc6e0ff4b9da4ac2341852e41d99ab81983d329bb87d93,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 9 20:33:57.427239 env[1563]: time="2024-02-09T20:33:57.427228599Z" level=info msg="ImageUpdate event &ImageUpdate{Name:ghcr.io/flatcar/calico/cni:v3.27.0,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 9 20:33:57.428158 env[1563]: time="2024-02-09T20:33:57.428111228Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/cni@sha256:d943b4c23e82a39b0186a1a3b2fe8f728e543d503df72d7be521501a82b7e7b4,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 9 20:33:57.429861 env[1563]: time="2024-02-09T20:33:57.429842802Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.27.0\" returns image reference \"sha256:8e8d96a874c0e2f137bc6e0ff4b9da4ac2341852e41d99ab81983d329bb87d93\"" Feb 9 20:33:57.431016 env[1563]: time="2024-02-09T20:33:57.430963003Z" level=info msg="CreateContainer within sandbox \"7b272b09d4cd745926406ac3c0faf831dc1f7ebea63cd5efa41d8fae567dcd87\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Feb 9 20:33:57.435622 env[1563]: time="2024-02-09T20:33:57.435579668Z" level=info msg="CreateContainer within sandbox 
\"7b272b09d4cd745926406ac3c0faf831dc1f7ebea63cd5efa41d8fae567dcd87\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"53cca5fa8fdfa22f654f21118fb2b4aea26ef8a2f0b16b47cc3ddd88b8791151\"" Feb 9 20:33:57.435905 env[1563]: time="2024-02-09T20:33:57.435853895Z" level=info msg="StartContainer for \"53cca5fa8fdfa22f654f21118fb2b4aea26ef8a2f0b16b47cc3ddd88b8791151\"" Feb 9 20:33:57.490971 env[1563]: time="2024-02-09T20:33:57.490875714Z" level=info msg="StartContainer for \"53cca5fa8fdfa22f654f21118fb2b4aea26ef8a2f0b16b47cc3ddd88b8791151\" returns successfully" Feb 9 20:33:57.884296 kubelet[2767]: E0209 20:33:57.884191 2767 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-dx2ql" podUID=21aae8c4-8c7c-48d6-86a1-b78761bdb569 Feb 9 20:33:58.306286 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-53cca5fa8fdfa22f654f21118fb2b4aea26ef8a2f0b16b47cc3ddd88b8791151-rootfs.mount: Deactivated successfully. 
Feb 9 20:33:58.374010 kubelet[2767]: I0209 20:33:58.373914 2767 kubelet_node_status.go:493] "Fast updating node status as it just became ready" Feb 9 20:33:58.391863 kubelet[2767]: I0209 20:33:58.391843 2767 topology_manager.go:210] "Topology Admit Handler" Feb 9 20:33:58.411945 kubelet[2767]: I0209 20:33:58.411888 2767 topology_manager.go:210] "Topology Admit Handler" Feb 9 20:33:58.413190 kubelet[2767]: I0209 20:33:58.413109 2767 topology_manager.go:210] "Topology Admit Handler" Feb 9 20:33:58.492884 kubelet[2767]: I0209 20:33:58.492782 2767 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/65f64fea-8fa3-417d-9fee-7bbdf36de2c6-config-volume\") pod \"coredns-787d4945fb-qgbqd\" (UID: \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\") " pod="kube-system/coredns-787d4945fb-qgbqd" Feb 9 20:33:58.492884 kubelet[2767]: I0209 20:33:58.492882 2767 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c5bd3826-3b40-4247-a328-b65f90095c86-config-volume\") pod \"coredns-787d4945fb-djg8j\" (UID: \"c5bd3826-3b40-4247-a328-b65f90095c86\") " pod="kube-system/coredns-787d4945fb-djg8j" Feb 9 20:33:58.493283 kubelet[2767]: I0209 20:33:58.493077 2767 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d4f3e13-ea2f-4678-87aa-ee971f79a1cb-tigera-ca-bundle\") pod \"calico-kube-controllers-68576cfd85-6rqtz\" (UID: \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\") " pod="calico-system/calico-kube-controllers-68576cfd85-6rqtz" Feb 9 20:33:58.493842 kubelet[2767]: I0209 20:33:58.493758 2767 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kffl5\" (UniqueName: \"kubernetes.io/projected/c5bd3826-3b40-4247-a328-b65f90095c86-kube-api-access-kffl5\") 
pod \"coredns-787d4945fb-djg8j\" (UID: \"c5bd3826-3b40-4247-a328-b65f90095c86\") " pod="kube-system/coredns-787d4945fb-djg8j" Feb 9 20:33:58.497043 kubelet[2767]: I0209 20:33:58.494371 2767 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8m9l\" (UniqueName: \"kubernetes.io/projected/0d4f3e13-ea2f-4678-87aa-ee971f79a1cb-kube-api-access-q8m9l\") pod \"calico-kube-controllers-68576cfd85-6rqtz\" (UID: \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\") " pod="calico-system/calico-kube-controllers-68576cfd85-6rqtz" Feb 9 20:33:58.497043 kubelet[2767]: I0209 20:33:58.494666 2767 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvz7j\" (UniqueName: \"kubernetes.io/projected/65f64fea-8fa3-417d-9fee-7bbdf36de2c6-kube-api-access-qvz7j\") pod \"coredns-787d4945fb-qgbqd\" (UID: \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\") " pod="kube-system/coredns-787d4945fb-qgbqd" Feb 9 20:33:58.695287 env[1563]: time="2024-02-09T20:33:58.695048310Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-787d4945fb-djg8j,Uid:c5bd3826-3b40-4247-a328-b65f90095c86,Namespace:kube-system,Attempt:0,}" Feb 9 20:33:58.716938 env[1563]: time="2024-02-09T20:33:58.716812812Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-787d4945fb-qgbqd,Uid:65f64fea-8fa3-417d-9fee-7bbdf36de2c6,Namespace:kube-system,Attempt:0,}" Feb 9 20:33:58.717193 env[1563]: time="2024-02-09T20:33:58.716914840Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-68576cfd85-6rqtz,Uid:0d4f3e13-ea2f-4678-87aa-ee971f79a1cb,Namespace:calico-system,Attempt:0,}" Feb 9 20:33:58.817124 env[1563]: time="2024-02-09T20:33:58.817021566Z" level=info msg="shim disconnected" id=53cca5fa8fdfa22f654f21118fb2b4aea26ef8a2f0b16b47cc3ddd88b8791151 Feb 9 20:33:58.817603 env[1563]: time="2024-02-09T20:33:58.817128475Z" level=warning msg="cleaning up after shim disconnected" 
id=53cca5fa8fdfa22f654f21118fb2b4aea26ef8a2f0b16b47cc3ddd88b8791151 namespace=k8s.io Feb 9 20:33:58.817603 env[1563]: time="2024-02-09T20:33:58.817170908Z" level=info msg="cleaning up dead shim" Feb 9 20:33:58.847446 env[1563]: time="2024-02-09T20:33:58.847381617Z" level=warning msg="cleanup warnings time=\"2024-02-09T20:33:58Z\" level=info msg=\"starting signal loop\" namespace=k8s.io pid=4228 runtime=io.containerd.runc.v2\n" Feb 9 20:33:58.882652 env[1563]: time="2024-02-09T20:33:58.882604073Z" level=error msg="Failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:33:58.882777 env[1563]: time="2024-02-09T20:33:58.882607876Z" level=error msg="Failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:33:58.882904 env[1563]: time="2024-02-09T20:33:58.882884513Z" level=error msg="encountered an error cleaning up failed sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:33:58.882904 env[1563]: time="2024-02-09T20:33:58.882895660Z" level=error msg="encountered an error cleaning up failed sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:33:58.883004 env[1563]: time="2024-02-09T20:33:58.882918680Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-787d4945fb-djg8j,Uid:c5bd3826-3b40-4247-a328-b65f90095c86,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:33:58.883004 env[1563]: time="2024-02-09T20:33:58.882921119Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-68576cfd85-6rqtz,Uid:0d4f3e13-ea2f-4678-87aa-ee971f79a1cb,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:33:58.883122 kubelet[2767]: E0209 20:33:58.883105 2767 remote_runtime.go:176] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:33:58.883189 kubelet[2767]: E0209 20:33:58.883150 2767 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-787d4945fb-djg8j" Feb 9 20:33:58.883189 kubelet[2767]: E0209 20:33:58.883165 2767 kuberuntime_manager.go:782] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-787d4945fb-djg8j" Feb 9 20:33:58.883189 kubelet[2767]: E0209 20:33:58.883104 2767 remote_runtime.go:176] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:33:58.883296 kubelet[2767]: E0209 20:33:58.883207 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-787d4945fb-djg8j_kube-system(c5bd3826-3b40-4247-a328-b65f90095c86)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-787d4945fb-djg8j_kube-system(c5bd3826-3b40-4247-a328-b65f90095c86)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-djg8j" podUID=c5bd3826-3b40-4247-a328-b65f90095c86 Feb 9 20:33:58.883296 kubelet[2767]: E0209 20:33:58.883211 2767 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = 
Unknown desc = failed to setup network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-68576cfd85-6rqtz" Feb 9 20:33:58.883296 kubelet[2767]: E0209 20:33:58.883234 2767 kuberuntime_manager.go:782] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-68576cfd85-6rqtz" Feb 9 20:33:58.883438 kubelet[2767]: E0209 20:33:58.883274 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-68576cfd85-6rqtz_calico-system(0d4f3e13-ea2f-4678-87aa-ee971f79a1cb)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-68576cfd85-6rqtz_calico-system(0d4f3e13-ea2f-4678-87aa-ee971f79a1cb)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-68576cfd85-6rqtz" podUID=0d4f3e13-ea2f-4678-87aa-ee971f79a1cb Feb 9 20:33:58.883562 env[1563]: time="2024-02-09T20:33:58.883535364Z" level=error msg="Failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such 
file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:33:58.883783 env[1563]: time="2024-02-09T20:33:58.883759764Z" level=error msg="encountered an error cleaning up failed sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:33:58.883814 env[1563]: time="2024-02-09T20:33:58.883794389Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-787d4945fb-qgbqd,Uid:65f64fea-8fa3-417d-9fee-7bbdf36de2c6,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:33:58.883928 kubelet[2767]: E0209 20:33:58.883920 2767 remote_runtime.go:176] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:33:58.883960 kubelet[2767]: E0209 20:33:58.883941 2767 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="kube-system/coredns-787d4945fb-qgbqd" Feb 9 20:33:58.883960 kubelet[2767]: E0209 20:33:58.883958 2767 kuberuntime_manager.go:782] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-787d4945fb-qgbqd" Feb 9 20:33:58.884010 kubelet[2767]: E0209 20:33:58.883981 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-787d4945fb-qgbqd_kube-system(65f64fea-8fa3-417d-9fee-7bbdf36de2c6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-787d4945fb-qgbqd_kube-system(65f64fea-8fa3-417d-9fee-7bbdf36de2c6)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-qgbqd" podUID=65f64fea-8fa3-417d-9fee-7bbdf36de2c6 Feb 9 20:33:58.884368 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5-shm.mount: Deactivated successfully. Feb 9 20:33:58.884463 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4-shm.mount: Deactivated successfully. 
Feb 9 20:33:59.064932 kubelet[2767]: I0209 20:33:59.064878 2767 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e" Feb 9 20:33:59.066027 env[1563]: time="2024-02-09T20:33:59.065927383Z" level=info msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\"" Feb 9 20:33:59.071221 env[1563]: time="2024-02-09T20:33:59.071135874Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.27.0\"" Feb 9 20:33:59.071526 kubelet[2767]: I0209 20:33:59.071325 2767 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5" Feb 9 20:33:59.072535 env[1563]: time="2024-02-09T20:33:59.072438360Z" level=info msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\"" Feb 9 20:33:59.073640 kubelet[2767]: I0209 20:33:59.073557 2767 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4" Feb 9 20:33:59.074798 env[1563]: time="2024-02-09T20:33:59.074674996Z" level=info msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\"" Feb 9 20:33:59.117911 env[1563]: time="2024-02-09T20:33:59.117851152Z" level=error msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\" failed" error="failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:33:59.118088 env[1563]: time="2024-02-09T20:33:59.118054197Z" level=error msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\" failed" error="failed to destroy 
network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:33:59.118164 kubelet[2767]: E0209 20:33:59.118089 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5" Feb 9 20:33:59.118164 kubelet[2767]: E0209 20:33:59.118134 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5} Feb 9 20:33:59.118291 kubelet[2767]: E0209 20:33:59.118172 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:33:59.118291 kubelet[2767]: E0209 20:33:59.118192 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container 
is running and has mounted /var/lib/calico/" podSandboxID="88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4" Feb 9 20:33:59.118291 kubelet[2767]: E0209 20:33:59.118204 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-68576cfd85-6rqtz" podUID=0d4f3e13-ea2f-4678-87aa-ee971f79a1cb Feb 9 20:33:59.118291 kubelet[2767]: E0209 20:33:59.118224 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4} Feb 9 20:33:59.118551 kubelet[2767]: E0209 20:33:59.118263 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:33:59.118551 kubelet[2767]: E0209 20:33:59.118298 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: 
no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-djg8j" podUID=c5bd3826-3b40-4247-a328-b65f90095c86 Feb 9 20:33:59.118673 env[1563]: time="2024-02-09T20:33:59.118459794Z" level=error msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\" failed" error="failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:33:59.118717 kubelet[2767]: E0209 20:33:59.118576 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e" Feb 9 20:33:59.118717 kubelet[2767]: E0209 20:33:59.118600 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e} Feb 9 20:33:59.118717 kubelet[2767]: E0209 20:33:59.118633 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 
20:33:59.118717 kubelet[2767]: E0209 20:33:59.118661 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-qgbqd" podUID=65f64fea-8fa3-417d-9fee-7bbdf36de2c6 Feb 9 20:33:59.621721 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e-shm.mount: Deactivated successfully. Feb 9 20:33:59.889771 env[1563]: time="2024-02-09T20:33:59.889690516Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-dx2ql,Uid:21aae8c4-8c7c-48d6-86a1-b78761bdb569,Namespace:calico-system,Attempt:0,}" Feb 9 20:33:59.917683 env[1563]: time="2024-02-09T20:33:59.917617398Z" level=error msg="Failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:33:59.917904 env[1563]: time="2024-02-09T20:33:59.917855691Z" level=error msg="encountered an error cleaning up failed sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:33:59.917904 env[1563]: time="2024-02-09T20:33:59.917889060Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:csi-node-driver-dx2ql,Uid:21aae8c4-8c7c-48d6-86a1-b78761bdb569,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:33:59.918093 kubelet[2767]: E0209 20:33:59.918052 2767 remote_runtime.go:176] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:33:59.918133 kubelet[2767]: E0209 20:33:59.918103 2767 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-dx2ql" Feb 9 20:33:59.918133 kubelet[2767]: E0209 20:33:59.918131 2767 kuberuntime_manager.go:782] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-dx2ql" Feb 9 20:33:59.918194 kubelet[2767]: E0209 20:33:59.918187 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to 
\"CreatePodSandbox\" for \"csi-node-driver-dx2ql_calico-system(21aae8c4-8c7c-48d6-86a1-b78761bdb569)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-dx2ql_calico-system(21aae8c4-8c7c-48d6-86a1-b78761bdb569)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-dx2ql" podUID=21aae8c4-8c7c-48d6-86a1-b78761bdb569 Feb 9 20:33:59.919556 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1-shm.mount: Deactivated successfully. Feb 9 20:34:00.079173 kubelet[2767]: I0209 20:34:00.079081 2767 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1" Feb 9 20:34:00.080243 env[1563]: time="2024-02-09T20:34:00.080145280Z" level=info msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\"" Feb 9 20:34:00.129698 env[1563]: time="2024-02-09T20:34:00.129606243Z" level=error msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\" failed" error="failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:34:00.129855 kubelet[2767]: E0209 20:34:00.129836 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": 
plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1" Feb 9 20:34:00.129926 kubelet[2767]: E0209 20:34:00.129878 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1} Feb 9 20:34:00.129926 kubelet[2767]: E0209 20:34:00.129918 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:34:00.130057 kubelet[2767]: E0209 20:34:00.129949 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-dx2ql" podUID=21aae8c4-8c7c-48d6-86a1-b78761bdb569 Feb 9 20:34:02.868224 env[1563]: time="2024-02-09T20:34:02.868184565Z" level=info msg="StopPodSandbox for \"400c7caed17c29ec6c9d8480d9388563ccbff391b0df4fd0ef792e1c4ebac91f\"" Feb 9 20:34:02.868531 env[1563]: time="2024-02-09T20:34:02.868249199Z" level=info msg="TearDown network for sandbox 
\"400c7caed17c29ec6c9d8480d9388563ccbff391b0df4fd0ef792e1c4ebac91f\" successfully" Feb 9 20:34:02.868531 env[1563]: time="2024-02-09T20:34:02.868271594Z" level=info msg="StopPodSandbox for \"400c7caed17c29ec6c9d8480d9388563ccbff391b0df4fd0ef792e1c4ebac91f\" returns successfully" Feb 9 20:34:02.868531 env[1563]: time="2024-02-09T20:34:02.868461478Z" level=info msg="RemovePodSandbox for \"400c7caed17c29ec6c9d8480d9388563ccbff391b0df4fd0ef792e1c4ebac91f\"" Feb 9 20:34:02.868531 env[1563]: time="2024-02-09T20:34:02.868475470Z" level=info msg="Forcibly stopping sandbox \"400c7caed17c29ec6c9d8480d9388563ccbff391b0df4fd0ef792e1c4ebac91f\"" Feb 9 20:34:02.868531 env[1563]: time="2024-02-09T20:34:02.868509821Z" level=info msg="TearDown network for sandbox \"400c7caed17c29ec6c9d8480d9388563ccbff391b0df4fd0ef792e1c4ebac91f\" successfully" Feb 9 20:34:02.869759 env[1563]: time="2024-02-09T20:34:02.869747273Z" level=info msg="RemovePodSandbox \"400c7caed17c29ec6c9d8480d9388563ccbff391b0df4fd0ef792e1c4ebac91f\" returns successfully" Feb 9 20:34:02.870051 env[1563]: time="2024-02-09T20:34:02.870013969Z" level=info msg="StopPodSandbox for \"065f0ab880c22f2d020e2766cde9b3415d10e5606e1bc08e75e42d08a0bf0215\"" Feb 9 20:34:02.870117 env[1563]: time="2024-02-09T20:34:02.870045306Z" level=info msg="TearDown network for sandbox \"065f0ab880c22f2d020e2766cde9b3415d10e5606e1bc08e75e42d08a0bf0215\" successfully" Feb 9 20:34:02.870117 env[1563]: time="2024-02-09T20:34:02.870084085Z" level=info msg="StopPodSandbox for \"065f0ab880c22f2d020e2766cde9b3415d10e5606e1bc08e75e42d08a0bf0215\" returns successfully" Feb 9 20:34:02.870198 env[1563]: time="2024-02-09T20:34:02.870187386Z" level=info msg="RemovePodSandbox for \"065f0ab880c22f2d020e2766cde9b3415d10e5606e1bc08e75e42d08a0bf0215\"" Feb 9 20:34:02.870227 env[1563]: time="2024-02-09T20:34:02.870199602Z" level=info msg="Forcibly stopping sandbox \"065f0ab880c22f2d020e2766cde9b3415d10e5606e1bc08e75e42d08a0bf0215\"" Feb 9 20:34:02.870248 
env[1563]: time="2024-02-09T20:34:02.870227279Z" level=info msg="TearDown network for sandbox \"065f0ab880c22f2d020e2766cde9b3415d10e5606e1bc08e75e42d08a0bf0215\" successfully" Feb 9 20:34:02.871347 env[1563]: time="2024-02-09T20:34:02.871305637Z" level=info msg="RemovePodSandbox \"065f0ab880c22f2d020e2766cde9b3415d10e5606e1bc08e75e42d08a0bf0215\" returns successfully" Feb 9 20:34:10.885847 env[1563]: time="2024-02-09T20:34:10.885705491Z" level=info msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\"" Feb 9 20:34:10.915645 env[1563]: time="2024-02-09T20:34:10.915604243Z" level=error msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\" failed" error="failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:34:10.915797 kubelet[2767]: E0209 20:34:10.915785 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5" Feb 9 20:34:10.915983 kubelet[2767]: E0209 20:34:10.915828 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5} Feb 9 20:34:10.915983 kubelet[2767]: E0209 20:34:10.915850 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with 
KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:34:10.915983 kubelet[2767]: E0209 20:34:10.915867 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-68576cfd85-6rqtz" podUID=0d4f3e13-ea2f-4678-87aa-ee971f79a1cb Feb 9 20:34:11.885203 env[1563]: time="2024-02-09T20:34:11.885063333Z" level=info msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\"" Feb 9 20:34:11.885203 env[1563]: time="2024-02-09T20:34:11.885112890Z" level=info msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\"" Feb 9 20:34:11.885795 env[1563]: time="2024-02-09T20:34:11.885106046Z" level=info msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\"" Feb 9 20:34:11.903729 env[1563]: time="2024-02-09T20:34:11.903660749Z" level=error msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\" failed" error="failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" Feb 9 20:34:11.904129 kubelet[2767]: E0209 20:34:11.903921 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4" Feb 9 20:34:11.904129 kubelet[2767]: E0209 20:34:11.903998 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4} Feb 9 20:34:11.904129 kubelet[2767]: E0209 20:34:11.904044 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:34:11.904129 kubelet[2767]: E0209 20:34:11.904089 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-djg8j" podUID=c5bd3826-3b40-4247-a328-b65f90095c86 Feb 9 20:34:11.907164 
env[1563]: time="2024-02-09T20:34:11.907140018Z" level=error msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\" failed" error="failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:34:11.907212 env[1563]: time="2024-02-09T20:34:11.907154417Z" level=error msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\" failed" error="failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:34:11.907269 kubelet[2767]: E0209 20:34:11.907260 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1" Feb 9 20:34:11.907298 kubelet[2767]: E0209 20:34:11.907263 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
podSandboxID="d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e" Feb 9 20:34:11.907298 kubelet[2767]: E0209 20:34:11.907281 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1} Feb 9 20:34:11.907298 kubelet[2767]: E0209 20:34:11.907285 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e} Feb 9 20:34:11.907367 kubelet[2767]: E0209 20:34:11.907301 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:34:11.907367 kubelet[2767]: E0209 20:34:11.907305 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:34:11.907367 kubelet[2767]: E0209 20:34:11.907320 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": 
plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-dx2ql" podUID=21aae8c4-8c7c-48d6-86a1-b78761bdb569 Feb 9 20:34:11.907472 kubelet[2767]: E0209 20:34:11.907322 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-qgbqd" podUID=65f64fea-8fa3-417d-9fee-7bbdf36de2c6 Feb 9 20:34:22.886515 env[1563]: time="2024-02-09T20:34:22.886398686Z" level=info msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\"" Feb 9 20:34:22.933415 env[1563]: time="2024-02-09T20:34:22.933335244Z" level=error msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\" failed" error="failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:34:22.933647 kubelet[2767]: E0209 20:34:22.933620 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has 
mounted /var/lib/calico/" podSandboxID="88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4" Feb 9 20:34:22.934103 kubelet[2767]: E0209 20:34:22.933676 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4} Feb 9 20:34:22.934103 kubelet[2767]: E0209 20:34:22.933734 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:34:22.934103 kubelet[2767]: E0209 20:34:22.933782 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-djg8j" podUID=c5bd3826-3b40-4247-a328-b65f90095c86 Feb 9 20:34:23.885660 env[1563]: time="2024-02-09T20:34:23.885576475Z" level=info msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\"" Feb 9 20:34:23.941787 env[1563]: time="2024-02-09T20:34:23.941708049Z" level=error msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\" failed" error="failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": 
plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:34:23.942335 kubelet[2767]: E0209 20:34:23.942046 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1" Feb 9 20:34:23.942335 kubelet[2767]: E0209 20:34:23.942097 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1} Feb 9 20:34:23.942335 kubelet[2767]: E0209 20:34:23.942153 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:34:23.942335 kubelet[2767]: E0209 20:34:23.942194 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container 
is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-dx2ql" podUID=21aae8c4-8c7c-48d6-86a1-b78761bdb569 Feb 9 20:34:24.885429 env[1563]: time="2024-02-09T20:34:24.885264322Z" level=info msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\"" Feb 9 20:34:24.938644 env[1563]: time="2024-02-09T20:34:24.938528409Z" level=error msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\" failed" error="failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:34:24.938905 kubelet[2767]: E0209 20:34:24.938844 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5" Feb 9 20:34:24.938905 kubelet[2767]: E0209 20:34:24.938894 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5} Feb 9 20:34:24.939104 kubelet[2767]: E0209 20:34:24.938947 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such 
file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:34:24.939104 kubelet[2767]: E0209 20:34:24.939010 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-68576cfd85-6rqtz" podUID=0d4f3e13-ea2f-4678-87aa-ee971f79a1cb Feb 9 20:34:26.885551 env[1563]: time="2024-02-09T20:34:26.885403294Z" level=info msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\"" Feb 9 20:34:26.912464 env[1563]: time="2024-02-09T20:34:26.912417668Z" level=error msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\" failed" error="failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:34:26.912582 kubelet[2767]: E0209 20:34:26.912570 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e" Feb 9 20:34:26.912767 kubelet[2767]: E0209 
20:34:26.912596 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e} Feb 9 20:34:26.912767 kubelet[2767]: E0209 20:34:26.912626 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:34:26.912767 kubelet[2767]: E0209 20:34:26.912650 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-qgbqd" podUID=65f64fea-8fa3-417d-9fee-7bbdf36de2c6 Feb 9 20:34:35.885758 env[1563]: time="2024-02-09T20:34:35.885655458Z" level=info msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\"" Feb 9 20:34:35.914921 env[1563]: time="2024-02-09T20:34:35.914885575Z" level=error msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\" failed" error="failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and 
has mounted /var/lib/calico/" Feb 9 20:34:35.915192 kubelet[2767]: E0209 20:34:35.915181 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4" Feb 9 20:34:35.915392 kubelet[2767]: E0209 20:34:35.915208 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4} Feb 9 20:34:35.915392 kubelet[2767]: E0209 20:34:35.915230 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:34:35.915392 kubelet[2767]: E0209 20:34:35.915248 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-djg8j" podUID=c5bd3826-3b40-4247-a328-b65f90095c86 Feb 9 
20:34:37.884839 env[1563]: time="2024-02-09T20:34:37.884717724Z" level=info msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\"" Feb 9 20:34:37.885717 env[1563]: time="2024-02-09T20:34:37.884953062Z" level=info msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\"" Feb 9 20:34:37.907571 env[1563]: time="2024-02-09T20:34:37.907508311Z" level=error msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\" failed" error="failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:34:37.907571 env[1563]: time="2024-02-09T20:34:37.907559090Z" level=error msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\" failed" error="failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:34:37.907712 kubelet[2767]: E0209 20:34:37.907671 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5" Feb 9 20:34:37.907712 kubelet[2767]: E0209 20:34:37.907687 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = 
Unknown desc = failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1" Feb 9 20:34:37.907712 kubelet[2767]: E0209 20:34:37.907700 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5} Feb 9 20:34:37.907712 kubelet[2767]: E0209 20:34:37.907702 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1} Feb 9 20:34:37.907936 kubelet[2767]: E0209 20:34:37.907725 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:34:37.907936 kubelet[2767]: E0209 20:34:37.907725 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:34:37.907936 kubelet[2767]: E0209 20:34:37.907744 
2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-dx2ql" podUID=21aae8c4-8c7c-48d6-86a1-b78761bdb569 Feb 9 20:34:37.908042 kubelet[2767]: E0209 20:34:37.907744 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-68576cfd85-6rqtz" podUID=0d4f3e13-ea2f-4678-87aa-ee971f79a1cb Feb 9 20:34:39.885582 env[1563]: time="2024-02-09T20:34:39.885458411Z" level=info msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\"" Feb 9 20:34:39.911332 env[1563]: time="2024-02-09T20:34:39.911297436Z" level=error msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\" failed" error="failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:34:39.911610 kubelet[2767]: E0209 20:34:39.911570 2767 remote_runtime.go:205] "StopPodSandbox from 
runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e" Feb 9 20:34:39.911610 kubelet[2767]: E0209 20:34:39.911598 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e} Feb 9 20:34:39.911794 kubelet[2767]: E0209 20:34:39.911619 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:34:39.911794 kubelet[2767]: E0209 20:34:39.911636 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-qgbqd" podUID=65f64fea-8fa3-417d-9fee-7bbdf36de2c6 Feb 9 20:34:48.885514 env[1563]: time="2024-02-09T20:34:48.885388890Z" level=info msg="StopPodSandbox for 
\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\"" Feb 9 20:34:48.911701 env[1563]: time="2024-02-09T20:34:48.911644221Z" level=error msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\" failed" error="failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:34:48.911938 kubelet[2767]: E0209 20:34:48.911899 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5" Feb 9 20:34:48.911938 kubelet[2767]: E0209 20:34:48.911928 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5} Feb 9 20:34:48.912132 kubelet[2767]: E0209 20:34:48.911950 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:34:48.912132 kubelet[2767]: E0209 20:34:48.911969 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to 
\"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-68576cfd85-6rqtz" podUID=0d4f3e13-ea2f-4678-87aa-ee971f79a1cb Feb 9 20:34:50.885519 env[1563]: time="2024-02-09T20:34:50.885428846Z" level=info msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\"" Feb 9 20:34:50.920493 env[1563]: time="2024-02-09T20:34:50.920391278Z" level=error msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\" failed" error="failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:34:50.920692 kubelet[2767]: E0209 20:34:50.920667 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4" Feb 9 20:34:50.921127 kubelet[2767]: E0209 20:34:50.920719 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4} Feb 9 20:34:50.921127 kubelet[2767]: E0209 20:34:50.920773 
2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:34:50.921127 kubelet[2767]: E0209 20:34:50.920815 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-djg8j" podUID=c5bd3826-3b40-4247-a328-b65f90095c86 Feb 9 20:34:51.885491 env[1563]: time="2024-02-09T20:34:51.885330958Z" level=info msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\"" Feb 9 20:34:51.936898 env[1563]: time="2024-02-09T20:34:51.936835592Z" level=error msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\" failed" error="failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:34:51.937311 kubelet[2767]: E0209 20:34:51.937075 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for 
sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1" Feb 9 20:34:51.937311 kubelet[2767]: E0209 20:34:51.937124 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1} Feb 9 20:34:51.937311 kubelet[2767]: E0209 20:34:51.937176 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:34:51.937311 kubelet[2767]: E0209 20:34:51.937217 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-dx2ql" podUID=21aae8c4-8c7c-48d6-86a1-b78761bdb569 Feb 9 20:34:53.885779 env[1563]: time="2024-02-09T20:34:53.885641270Z" level=info msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\"" Feb 9 20:34:53.944664 env[1563]: time="2024-02-09T20:34:53.944538717Z" 
level=error msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\" failed" error="failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:34:53.945013 kubelet[2767]: E0209 20:34:53.944976 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e" Feb 9 20:34:53.945819 kubelet[2767]: E0209 20:34:53.945070 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e} Feb 9 20:34:53.945819 kubelet[2767]: E0209 20:34:53.945186 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:34:53.945819 kubelet[2767]: E0209 20:34:53.945278 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy 
network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-qgbqd" podUID=65f64fea-8fa3-417d-9fee-7bbdf36de2c6 Feb 9 20:35:02.884036 env[1563]: time="2024-02-09T20:35:02.883976154Z" level=info msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\"" Feb 9 20:35:02.898892 env[1563]: time="2024-02-09T20:35:02.898824897Z" level=error msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\" failed" error="failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:35:02.899028 kubelet[2767]: E0209 20:35:02.898988 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1" Feb 9 20:35:02.899028 kubelet[2767]: E0209 20:35:02.899021 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1} Feb 9 20:35:02.899290 kubelet[2767]: E0209 20:35:02.899050 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with 
KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:35:02.899290 kubelet[2767]: E0209 20:35:02.899074 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-dx2ql" podUID=21aae8c4-8c7c-48d6-86a1-b78761bdb569 Feb 9 20:35:03.885382 env[1563]: time="2024-02-09T20:35:03.885225460Z" level=info msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\"" Feb 9 20:35:03.912055 env[1563]: time="2024-02-09T20:35:03.912020224Z" level=error msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\" failed" error="failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:35:03.912212 kubelet[2767]: E0209 20:35:03.912202 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no 
such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5" Feb 9 20:35:03.912416 kubelet[2767]: E0209 20:35:03.912229 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5} Feb 9 20:35:03.912416 kubelet[2767]: E0209 20:35:03.912252 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:35:03.912416 kubelet[2767]: E0209 20:35:03.912269 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-68576cfd85-6rqtz" podUID=0d4f3e13-ea2f-4678-87aa-ee971f79a1cb Feb 9 20:35:05.884867 env[1563]: time="2024-02-09T20:35:05.884738466Z" level=info msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\"" Feb 9 20:35:05.885948 env[1563]: time="2024-02-09T20:35:05.884855284Z" level=info msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\"" Feb 9 20:35:05.911779 
env[1563]: time="2024-02-09T20:35:05.911745293Z" level=error msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\" failed" error="failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:35:05.911892 env[1563]: time="2024-02-09T20:35:05.911743789Z" level=error msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\" failed" error="failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:35:05.911927 kubelet[2767]: E0209 20:35:05.911915 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e" Feb 9 20:35:05.912115 kubelet[2767]: E0209 20:35:05.911947 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e} Feb 9 20:35:05.912115 kubelet[2767]: E0209 20:35:05.911970 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox 
\\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:35:05.912115 kubelet[2767]: E0209 20:35:05.911916 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4" Feb 9 20:35:05.912115 kubelet[2767]: E0209 20:35:05.911989 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-qgbqd" podUID=65f64fea-8fa3-417d-9fee-7bbdf36de2c6 Feb 9 20:35:05.912115 kubelet[2767]: E0209 20:35:05.912001 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4} Feb 9 20:35:05.912252 kubelet[2767]: E0209 20:35:05.912020 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox 
\\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:35:05.912252 kubelet[2767]: E0209 20:35:05.912035 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-djg8j" podUID=c5bd3826-3b40-4247-a328-b65f90095c86 Feb 9 20:35:13.885713 env[1563]: time="2024-02-09T20:35:13.885502079Z" level=info msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\"" Feb 9 20:35:13.938956 env[1563]: time="2024-02-09T20:35:13.938845232Z" level=error msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\" failed" error="failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:35:13.939194 kubelet[2767]: E0209 20:35:13.939163 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" podSandboxID="4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1"
Feb 9 20:35:13.939635 kubelet[2767]: E0209 20:35:13.939216 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1}
Feb 9 20:35:13.939635 kubelet[2767]: E0209 20:35:13.939273 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 9 20:35:13.939635 kubelet[2767]: E0209 20:35:13.939318 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-dx2ql" podUID=21aae8c4-8c7c-48d6-86a1-b78761bdb569
Feb 9 20:35:15.885771 env[1563]: time="2024-02-09T20:35:15.885647391Z" level=info msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\""
Feb 9 20:35:15.937148 env[1563]: time="2024-02-09T20:35:15.937071338Z" level=error msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\" failed" error="failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 9 20:35:15.937408 kubelet[2767]: E0209 20:35:15.937347 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5"
Feb 9 20:35:15.937408 kubelet[2767]: E0209 20:35:15.937392 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5}
Feb 9 20:35:15.937828 kubelet[2767]: E0209 20:35:15.937434 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 9 20:35:15.937828 kubelet[2767]: E0209 20:35:15.937469 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-68576cfd85-6rqtz" podUID=0d4f3e13-ea2f-4678-87aa-ee971f79a1cb
Feb 9 20:35:16.884994 env[1563]: time="2024-02-09T20:35:16.884896544Z" level=info msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\""
Feb 9 20:35:16.920550 env[1563]: time="2024-02-09T20:35:16.920515641Z" level=error msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\" failed" error="failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 9 20:35:16.920876 kubelet[2767]: E0209 20:35:16.920769 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4"
Feb 9 20:35:16.920876 kubelet[2767]: E0209 20:35:16.920792 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4}
Feb 9 20:35:16.920876 kubelet[2767]: E0209 20:35:16.920813 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 9 20:35:16.920876 kubelet[2767]: E0209 20:35:16.920830 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-djg8j" podUID=c5bd3826-3b40-4247-a328-b65f90095c86
Feb 9 20:35:19.885660 env[1563]: time="2024-02-09T20:35:19.885547811Z" level=info msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\""
Feb 9 20:35:19.915457 env[1563]: time="2024-02-09T20:35:19.915419662Z" level=error msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\" failed" error="failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 9 20:35:19.915590 kubelet[2767]: E0209 20:35:19.915581 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e"
Feb 9 20:35:19.915767 kubelet[2767]: E0209 20:35:19.915606 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e}
Feb 9 20:35:19.915767 kubelet[2767]: E0209 20:35:19.915627 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 9 20:35:19.915767 kubelet[2767]: E0209 20:35:19.915659 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-qgbqd" podUID=65f64fea-8fa3-417d-9fee-7bbdf36de2c6
Feb 9 20:35:25.885009 env[1563]: time="2024-02-09T20:35:25.884907570Z" level=info msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\""
Feb 9 20:35:25.938736 env[1563]: time="2024-02-09T20:35:25.938656497Z" level=error msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\" failed" error="failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 9 20:35:25.938967 kubelet[2767]: E0209 20:35:25.938944 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1"
Feb 9 20:35:25.939351 kubelet[2767]: E0209 20:35:25.938990 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1}
Feb 9 20:35:25.939351 kubelet[2767]: E0209 20:35:25.939036 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 9 20:35:25.939351 kubelet[2767]: E0209 20:35:25.939070 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-dx2ql" podUID=21aae8c4-8c7c-48d6-86a1-b78761bdb569
Feb 9 20:35:28.885783 env[1563]: time="2024-02-09T20:35:28.885650842Z" level=info msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\""
Feb 9 20:35:28.911805 env[1563]: time="2024-02-09T20:35:28.911769699Z" level=error msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\" failed" error="failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 9 20:35:28.911943 kubelet[2767]: E0209 20:35:28.911931 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4"
Feb 9 20:35:28.912118 kubelet[2767]: E0209 20:35:28.911959 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4}
Feb 9 20:35:28.912118 kubelet[2767]: E0209 20:35:28.911982 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 9 20:35:28.912118 kubelet[2767]: E0209 20:35:28.912000 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-djg8j" podUID=c5bd3826-3b40-4247-a328-b65f90095c86
Feb 9 20:35:30.885128 env[1563]: time="2024-02-09T20:35:30.885036653Z" level=info msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\""
Feb 9 20:35:30.911052 env[1563]: time="2024-02-09T20:35:30.911018676Z" level=error msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\" failed" error="failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 9 20:35:30.911196 kubelet[2767]: E0209 20:35:30.911183 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5"
Feb 9 20:35:30.911404 kubelet[2767]: E0209 20:35:30.911212 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5}
Feb 9 20:35:30.911404 kubelet[2767]: E0209 20:35:30.911233 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 9 20:35:30.911404 kubelet[2767]: E0209 20:35:30.911251 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-68576cfd85-6rqtz" podUID=0d4f3e13-ea2f-4678-87aa-ee971f79a1cb
Feb 9 20:35:33.885602 env[1563]: time="2024-02-09T20:35:33.885512514Z" level=info msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\""
Feb 9 20:35:33.935418 env[1563]: time="2024-02-09T20:35:33.935254974Z" level=error msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\" failed" error="failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 9 20:35:33.935620 kubelet[2767]: E0209 20:35:33.935595 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e"
Feb 9 20:35:33.936133 kubelet[2767]: E0209 20:35:33.935660 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e}
Feb 9 20:35:33.936133 kubelet[2767]: E0209 20:35:33.935747 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 9 20:35:33.936133 kubelet[2767]: E0209 20:35:33.935817 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-qgbqd" podUID=65f64fea-8fa3-417d-9fee-7bbdf36de2c6
Feb 9 20:35:38.885553 env[1563]: time="2024-02-09T20:35:38.885419532Z" level=info msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\""
Feb 9 20:35:38.912211 env[1563]: time="2024-02-09T20:35:38.912146280Z" level=error msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\" failed" error="failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 9 20:35:38.912355 kubelet[2767]: E0209 20:35:38.912325 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1"
Feb 9 20:35:38.912556 kubelet[2767]: E0209 20:35:38.912359 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1}
Feb 9 20:35:38.912556 kubelet[2767]: E0209 20:35:38.912406 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 9 20:35:38.912556 kubelet[2767]: E0209 20:35:38.912424 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-dx2ql" podUID=21aae8c4-8c7c-48d6-86a1-b78761bdb569
Feb 9 20:35:39.884966 env[1563]: time="2024-02-09T20:35:39.884865016Z" level=info msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\""
Feb 9 20:35:39.936837 env[1563]: time="2024-02-09T20:35:39.936725434Z" level=error msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\" failed" error="failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 9 20:35:39.937331 kubelet[2767]: E0209 20:35:39.937030 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4"
Feb 9 20:35:39.937331 kubelet[2767]: E0209 20:35:39.937087 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4}
Feb 9 20:35:39.937331 kubelet[2767]: E0209 20:35:39.937158 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 9 20:35:39.937331 kubelet[2767]: E0209 20:35:39.937220 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-djg8j" podUID=c5bd3826-3b40-4247-a328-b65f90095c86
Feb 9 20:35:41.884700 env[1563]: time="2024-02-09T20:35:41.884595946Z" level=info msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\""
Feb 9 20:35:41.913963 env[1563]: time="2024-02-09T20:35:41.913924617Z" level=error msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\" failed" error="failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 9 20:35:41.914134 kubelet[2767]: E0209 20:35:41.914123 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5"
Feb 9 20:35:41.914322 kubelet[2767]: E0209 20:35:41.914152 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5}
Feb 9 20:35:41.914322 kubelet[2767]: E0209 20:35:41.914182 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 9 20:35:41.914322 kubelet[2767]: E0209 20:35:41.914208 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-68576cfd85-6rqtz" podUID=0d4f3e13-ea2f-4678-87aa-ee971f79a1cb
Feb 9 20:35:48.885452 env[1563]: time="2024-02-09T20:35:48.885333322Z" level=info msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\""
Feb 9 20:35:48.912752 env[1563]: time="2024-02-09T20:35:48.912713585Z" level=error msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\" failed" error="failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 9 20:35:48.912943 kubelet[2767]: E0209 20:35:48.912899 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e"
Feb 9 20:35:48.912943 kubelet[2767]: E0209 20:35:48.912925 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e}
Feb 9 20:35:48.913153 kubelet[2767]: E0209 20:35:48.912947 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 9 20:35:48.913153 kubelet[2767]: E0209 20:35:48.912966 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-qgbqd" podUID=65f64fea-8fa3-417d-9fee-7bbdf36de2c6
Feb 9 20:35:52.885747 env[1563]: time="2024-02-09T20:35:52.885635844Z" level=info msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\""
Feb 9 20:35:52.946689 env[1563]: time="2024-02-09T20:35:52.946566912Z" level=error msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\" failed" error="failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 9 20:35:52.946934 kubelet[2767]: E0209 20:35:52.946904 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1"
Feb 9 20:35:52.947570 kubelet[2767]: E0209 20:35:52.946977 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1}
Feb 9 20:35:52.947570 kubelet[2767]: E0209 20:35:52.947052 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 9 20:35:52.947570 kubelet[2767]: E0209 20:35:52.947108 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-dx2ql" podUID=21aae8c4-8c7c-48d6-86a1-b78761bdb569
Feb 9 20:35:53.885225 env[1563]: time="2024-02-09T20:35:53.885133889Z" level=info msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\""
Feb 9 20:35:53.947165 env[1563]: time="2024-02-09T20:35:53.947032025Z" level=error msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\" failed" error="failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 9 20:35:53.948065 kubelet[2767]: E0209 20:35:53.947535 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5"
Feb 9 20:35:53.948065 kubelet[2767]: E0209 20:35:53.947655 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5}
Feb 9 20:35:53.948065 kubelet[2767]: E0209 20:35:53.947829 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 9 20:35:53.948065 kubelet[2767]: E0209 20:35:53.947970 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-68576cfd85-6rqtz" podUID=0d4f3e13-ea2f-4678-87aa-ee971f79a1cb
Feb 9 20:35:54.885232 env[1563]: time="2024-02-09T20:35:54.885150559Z" level=info msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\""
Feb 9 20:35:54.937583 env[1563]: time="2024-02-09T20:35:54.937498335Z" level=error msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\" failed" error="failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 9 20:35:54.937819 kubelet[2767]: E0209 20:35:54.937759 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4"
Feb 9 20:35:54.937819 kubelet[2767]: E0209 20:35:54.937803 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4}
Feb 9 20:35:54.937961 kubelet[2767]: E0209 20:35:54.937848 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 9 20:35:54.937961 kubelet[2767]: E0209 20:35:54.937884 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-djg8j" podUID=c5bd3826-3b40-4247-a328-b65f90095c86
Feb 9 20:36:01.885272 env[1563]: time="2024-02-09T20:36:01.885182789Z" level=info msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\""
Feb 9 20:36:01.943998 env[1563]: time="2024-02-09T20:36:01.943841082Z" level=error msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\" failed" error="failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 9 20:36:01.944358 kubelet[2767]: E0209 20:36:01.944299 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e"
Feb 9 20:36:01.945174 kubelet[2767]: E0209 20:36:01.944409 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e}
Feb 9 20:36:01.945174 kubelet[2767]: E0209 20:36:01.944509 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 9 20:36:01.945174 kubelet[2767]: E0209 20:36:01.944590 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-qgbqd" podUID=65f64fea-8fa3-417d-9fee-7bbdf36de2c6
Feb 9 20:36:03.885255 env[1563]: time="2024-02-09T20:36:03.885166953Z" level=info msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\""
Feb 9 20:36:03.914915 env[1563]: time="2024-02-09T20:36:03.914850886Z" level=error msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\" failed" error="failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 9 20:36:03.915079 kubelet[2767]: E0209 20:36:03.915033 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check
that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1" Feb 9 20:36:03.915079 kubelet[2767]: E0209 20:36:03.915061 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1} Feb 9 20:36:03.915290 kubelet[2767]: E0209 20:36:03.915083 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:36:03.915290 kubelet[2767]: E0209 20:36:03.915101 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-dx2ql" podUID=21aae8c4-8c7c-48d6-86a1-b78761bdb569 Feb 9 20:36:07.885030 env[1563]: time="2024-02-09T20:36:07.884900120Z" level=info msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\"" Feb 9 20:36:07.911064 env[1563]: time="2024-02-09T20:36:07.911003525Z" level=error msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\" failed" error="failed to destroy network for sandbox 
\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:36:07.911224 kubelet[2767]: E0209 20:36:07.911213 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4" Feb 9 20:36:07.911444 kubelet[2767]: E0209 20:36:07.911239 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4} Feb 9 20:36:07.911444 kubelet[2767]: E0209 20:36:07.911260 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:36:07.911444 kubelet[2767]: E0209 20:36:07.911279 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-djg8j" podUID=c5bd3826-3b40-4247-a328-b65f90095c86 Feb 9 20:36:08.885033 env[1563]: time="2024-02-09T20:36:08.884908483Z" level=info msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\"" Feb 9 20:36:08.911851 env[1563]: time="2024-02-09T20:36:08.911815452Z" level=error msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\" failed" error="failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:36:08.912050 kubelet[2767]: E0209 20:36:08.912040 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5" Feb 9 20:36:08.912228 kubelet[2767]: E0209 20:36:08.912067 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5} Feb 9 20:36:08.912228 kubelet[2767]: E0209 20:36:08.912104 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox 
\\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:36:08.912228 kubelet[2767]: E0209 20:36:08.912123 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-68576cfd85-6rqtz" podUID=0d4f3e13-ea2f-4678-87aa-ee971f79a1cb Feb 9 20:36:16.884702 env[1563]: time="2024-02-09T20:36:16.884599499Z" level=info msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\"" Feb 9 20:36:16.898491 env[1563]: time="2024-02-09T20:36:16.898428390Z" level=error msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\" failed" error="failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:36:16.898645 kubelet[2767]: E0209 20:36:16.898598 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and 
has mounted /var/lib/calico/" podSandboxID="d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e" Feb 9 20:36:16.898645 kubelet[2767]: E0209 20:36:16.898624 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e} Feb 9 20:36:16.898645 kubelet[2767]: E0209 20:36:16.898648 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:36:16.898916 kubelet[2767]: E0209 20:36:16.898667 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-qgbqd" podUID=65f64fea-8fa3-417d-9fee-7bbdf36de2c6 Feb 9 20:36:17.885747 env[1563]: time="2024-02-09T20:36:17.885609618Z" level=info msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\"" Feb 9 20:36:17.912578 env[1563]: time="2024-02-09T20:36:17.912514363Z" level=error msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\" failed" error="failed to destroy network for sandbox 
\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:36:17.912784 kubelet[2767]: E0209 20:36:17.912728 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1" Feb 9 20:36:17.912784 kubelet[2767]: E0209 20:36:17.912755 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1} Feb 9 20:36:17.912784 kubelet[2767]: E0209 20:36:17.912777 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:36:17.913013 kubelet[2767]: E0209 20:36:17.912794 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-dx2ql" podUID=21aae8c4-8c7c-48d6-86a1-b78761bdb569 Feb 9 20:36:20.883891 env[1563]: time="2024-02-09T20:36:20.883850029Z" level=info msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\"" Feb 9 20:36:20.883891 env[1563]: time="2024-02-09T20:36:20.883878792Z" level=info msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\"" Feb 9 20:36:20.900162 env[1563]: time="2024-02-09T20:36:20.900095535Z" level=error msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\" failed" error="failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:36:20.900316 env[1563]: time="2024-02-09T20:36:20.900177191Z" level=error msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\" failed" error="failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:36:20.900354 kubelet[2767]: E0209 20:36:20.900301 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
podSandboxID="88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4" Feb 9 20:36:20.900354 kubelet[2767]: E0209 20:36:20.900343 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4} Feb 9 20:36:20.900354 kubelet[2767]: E0209 20:36:20.900301 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5" Feb 9 20:36:20.900600 kubelet[2767]: E0209 20:36:20.900363 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5} Feb 9 20:36:20.900600 kubelet[2767]: E0209 20:36:20.900379 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:36:20.900600 kubelet[2767]: E0209 20:36:20.900382 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin 
type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:36:20.900600 kubelet[2767]: E0209 20:36:20.900403 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-68576cfd85-6rqtz" podUID=0d4f3e13-ea2f-4678-87aa-ee971f79a1cb Feb 9 20:36:20.900740 kubelet[2767]: E0209 20:36:20.900403 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-djg8j" podUID=c5bd3826-3b40-4247-a328-b65f90095c86 Feb 9 20:36:28.884792 env[1563]: time="2024-02-09T20:36:28.884710571Z" level=info msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\"" Feb 9 20:36:28.938319 env[1563]: time="2024-02-09T20:36:28.938244180Z" level=error msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\" failed" error="failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:36:28.938605 kubelet[2767]: E0209 20:36:28.938543 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1" Feb 9 20:36:28.938605 kubelet[2767]: E0209 20:36:28.938592 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1} Feb 9 20:36:28.939136 kubelet[2767]: E0209 20:36:28.938643 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:36:28.939136 kubelet[2767]: E0209 20:36:28.938685 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/\"" pod="calico-system/csi-node-driver-dx2ql" podUID=21aae8c4-8c7c-48d6-86a1-b78761bdb569 Feb 9 20:36:30.885597 env[1563]: time="2024-02-09T20:36:30.885494891Z" level=info msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\"" Feb 9 20:36:30.912211 env[1563]: time="2024-02-09T20:36:30.912154078Z" level=error msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\" failed" error="failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:36:30.912455 kubelet[2767]: E0209 20:36:30.912419 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e" Feb 9 20:36:30.912642 kubelet[2767]: E0209 20:36:30.912462 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e} Feb 9 20:36:30.912642 kubelet[2767]: E0209 20:36:30.912498 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check 
that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:36:30.912642 kubelet[2767]: E0209 20:36:30.912515 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-qgbqd" podUID=65f64fea-8fa3-417d-9fee-7bbdf36de2c6 Feb 9 20:36:32.885843 env[1563]: time="2024-02-09T20:36:32.885721976Z" level=info msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\"" Feb 9 20:36:32.939491 env[1563]: time="2024-02-09T20:36:32.939386592Z" level=error msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\" failed" error="failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:36:32.939749 kubelet[2767]: E0209 20:36:32.939691 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5" Feb 9 20:36:32.939749 kubelet[2767]: E0209 20:36:32.939744 2767 
kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5} Feb 9 20:36:32.940253 kubelet[2767]: E0209 20:36:32.939800 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:36:32.940253 kubelet[2767]: E0209 20:36:32.939846 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-68576cfd85-6rqtz" podUID=0d4f3e13-ea2f-4678-87aa-ee971f79a1cb Feb 9 20:36:33.884671 env[1563]: time="2024-02-09T20:36:33.884538407Z" level=info msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\"" Feb 9 20:36:33.914208 env[1563]: time="2024-02-09T20:36:33.914149271Z" level=error msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\" failed" error="failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has 
mounted /var/lib/calico/" Feb 9 20:36:33.914517 kubelet[2767]: E0209 20:36:33.914409 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4" Feb 9 20:36:33.914517 kubelet[2767]: E0209 20:36:33.914436 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4} Feb 9 20:36:33.914517 kubelet[2767]: E0209 20:36:33.914457 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:36:33.914517 kubelet[2767]: E0209 20:36:33.914475 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-djg8j" podUID=c5bd3826-3b40-4247-a328-b65f90095c86 Feb 9 
20:36:41.884824 env[1563]: time="2024-02-09T20:36:41.884646412Z" level=info msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\"" Feb 9 20:36:41.911168 env[1563]: time="2024-02-09T20:36:41.911107180Z" level=error msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\" failed" error="failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:36:41.911303 kubelet[2767]: E0209 20:36:41.911286 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e" Feb 9 20:36:41.911556 kubelet[2767]: E0209 20:36:41.911318 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e} Feb 9 20:36:41.911556 kubelet[2767]: E0209 20:36:41.911373 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:36:41.911556 
kubelet[2767]: E0209 20:36:41.911439 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-qgbqd" podUID=65f64fea-8fa3-417d-9fee-7bbdf36de2c6 Feb 9 20:36:43.884792 env[1563]: time="2024-02-09T20:36:43.884704420Z" level=info msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\"" Feb 9 20:36:43.910838 env[1563]: time="2024-02-09T20:36:43.910765398Z" level=error msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\" failed" error="failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:36:43.910967 kubelet[2767]: E0209 20:36:43.910952 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1" Feb 9 20:36:43.911160 kubelet[2767]: E0209 20:36:43.910986 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd 
ID:4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1} Feb 9 20:36:43.911160 kubelet[2767]: E0209 20:36:43.911020 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:36:43.911160 kubelet[2767]: E0209 20:36:43.911049 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-dx2ql" podUID=21aae8c4-8c7c-48d6-86a1-b78761bdb569 Feb 9 20:36:47.885335 env[1563]: time="2024-02-09T20:36:47.885239304Z" level=info msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\"" Feb 9 20:36:47.915112 env[1563]: time="2024-02-09T20:36:47.915047295Z" level=error msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\" failed" error="failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:36:47.915290 kubelet[2767]: E0209 20:36:47.915277 2767 
remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5" Feb 9 20:36:47.915524 kubelet[2767]: E0209 20:36:47.915307 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5} Feb 9 20:36:47.915524 kubelet[2767]: E0209 20:36:47.915329 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:36:47.915524 kubelet[2767]: E0209 20:36:47.915370 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-68576cfd85-6rqtz" podUID=0d4f3e13-ea2f-4678-87aa-ee971f79a1cb Feb 9 20:36:48.885260 env[1563]: time="2024-02-09T20:36:48.885159868Z" level=info 
msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\"" Feb 9 20:36:48.901124 env[1563]: time="2024-02-09T20:36:48.901085398Z" level=error msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\" failed" error="failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:36:48.901374 kubelet[2767]: E0209 20:36:48.901226 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4" Feb 9 20:36:48.901374 kubelet[2767]: E0209 20:36:48.901253 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4} Feb 9 20:36:48.901374 kubelet[2767]: E0209 20:36:48.901275 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:36:48.901374 kubelet[2767]: E0209 20:36:48.901292 2767 pod_workers.go:965] "Error syncing pod, 
skipping" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-djg8j" podUID=c5bd3826-3b40-4247-a328-b65f90095c86 Feb 9 20:36:53.884642 env[1563]: time="2024-02-09T20:36:53.884531449Z" level=info msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\"" Feb 9 20:36:53.941125 env[1563]: time="2024-02-09T20:36:53.941000990Z" level=error msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\" failed" error="failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:36:53.941356 kubelet[2767]: E0209 20:36:53.941319 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e" Feb 9 20:36:53.941868 kubelet[2767]: E0209 20:36:53.941382 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e} Feb 9 20:36:53.941868 kubelet[2767]: E0209 
20:36:53.941453 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:36:53.941868 kubelet[2767]: E0209 20:36:53.941495 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-qgbqd" podUID=65f64fea-8fa3-417d-9fee-7bbdf36de2c6 Feb 9 20:36:56.885465 env[1563]: time="2024-02-09T20:36:56.885331546Z" level=info msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\"" Feb 9 20:36:56.914703 env[1563]: time="2024-02-09T20:36:56.914625919Z" level=error msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\" failed" error="failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:36:56.914836 kubelet[2767]: E0209 20:36:56.914786 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy 
network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1" Feb 9 20:36:56.914836 kubelet[2767]: E0209 20:36:56.914823 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1} Feb 9 20:36:56.915029 kubelet[2767]: E0209 20:36:56.914844 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:36:56.915029 kubelet[2767]: E0209 20:36:56.914860 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-dx2ql" podUID=21aae8c4-8c7c-48d6-86a1-b78761bdb569 Feb 9 20:37:00.884853 env[1563]: time="2024-02-09T20:37:00.884703375Z" level=info msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\"" Feb 9 20:37:00.913944 env[1563]: 
time="2024-02-09T20:37:00.913888068Z" level=error msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\" failed" error="failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:37:00.914204 kubelet[2767]: E0209 20:37:00.914193 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5" Feb 9 20:37:00.914423 kubelet[2767]: E0209 20:37:00.914221 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5} Feb 9 20:37:00.914423 kubelet[2767]: E0209 20:37:00.914244 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:37:00.914423 kubelet[2767]: E0209 20:37:00.914261 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = 
Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-68576cfd85-6rqtz" podUID=0d4f3e13-ea2f-4678-87aa-ee971f79a1cb Feb 9 20:37:02.883802 env[1563]: time="2024-02-09T20:37:02.883765181Z" level=info msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\"" Feb 9 20:37:02.898384 env[1563]: time="2024-02-09T20:37:02.898311667Z" level=error msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\" failed" error="failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:37:02.898525 kubelet[2767]: E0209 20:37:02.898497 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4" Feb 9 20:37:02.898750 kubelet[2767]: E0209 20:37:02.898530 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4} Feb 9 20:37:02.898750 kubelet[2767]: E0209 20:37:02.898561 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for 
\"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:37:02.898750 kubelet[2767]: E0209 20:37:02.898584 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-djg8j" podUID=c5bd3826-3b40-4247-a328-b65f90095c86 Feb 9 20:37:08.885408 env[1563]: time="2024-02-09T20:37:08.885278517Z" level=info msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\"" Feb 9 20:37:08.911692 env[1563]: time="2024-02-09T20:37:08.911654892Z" level=error msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\" failed" error="failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:37:08.911878 kubelet[2767]: E0209 20:37:08.911837 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" 
failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e" Feb 9 20:37:08.911878 kubelet[2767]: E0209 20:37:08.911864 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e} Feb 9 20:37:08.912079 kubelet[2767]: E0209 20:37:08.911885 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:37:08.912079 kubelet[2767]: E0209 20:37:08.911902 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-qgbqd" podUID=65f64fea-8fa3-417d-9fee-7bbdf36de2c6 Feb 9 20:37:10.885058 env[1563]: time="2024-02-09T20:37:10.884919778Z" level=info msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\"" Feb 9 20:37:10.936300 env[1563]: time="2024-02-09T20:37:10.936241016Z" level=error msg="StopPodSandbox for 
\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\" failed" error="failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:37:10.936508 kubelet[2767]: E0209 20:37:10.936484 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1" Feb 9 20:37:10.936866 kubelet[2767]: E0209 20:37:10.936532 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1} Feb 9 20:37:10.936866 kubelet[2767]: E0209 20:37:10.936581 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:37:10.936866 kubelet[2767]: E0209 20:37:10.936622 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox 
\\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-dx2ql" podUID=21aae8c4-8c7c-48d6-86a1-b78761bdb569 Feb 9 20:37:14.885117 env[1563]: time="2024-02-09T20:37:14.884983728Z" level=info msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\"" Feb 9 20:37:14.938065 env[1563]: time="2024-02-09T20:37:14.937978290Z" level=error msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\" failed" error="failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:37:14.938258 kubelet[2767]: E0209 20:37:14.938239 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5" Feb 9 20:37:14.938643 kubelet[2767]: E0209 20:37:14.938282 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5} Feb 9 20:37:14.938643 kubelet[2767]: E0209 20:37:14.938368 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: 
code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:37:14.938643 kubelet[2767]: E0209 20:37:14.938407 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-68576cfd85-6rqtz" podUID=0d4f3e13-ea2f-4678-87aa-ee971f79a1cb Feb 9 20:37:15.885243 env[1563]: time="2024-02-09T20:37:15.885128182Z" level=info msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\"" Feb 9 20:37:15.911429 env[1563]: time="2024-02-09T20:37:15.911385197Z" level=error msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\" failed" error="failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:37:15.911609 kubelet[2767]: E0209 20:37:15.911573 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4" Feb 9 20:37:15.911609 kubelet[2767]: E0209 20:37:15.911599 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4} Feb 9 20:37:15.911680 kubelet[2767]: E0209 20:37:15.911621 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:37:15.911680 kubelet[2767]: E0209 20:37:15.911638 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-djg8j" podUID=c5bd3826-3b40-4247-a328-b65f90095c86 Feb 9 20:37:23.885281 env[1563]: time="2024-02-09T20:37:23.885169672Z" level=info msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\"" Feb 9 20:37:23.941387 env[1563]: time="2024-02-09T20:37:23.941305063Z" level=error msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\" failed" error="failed to destroy network for sandbox 
\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:37:23.941695 kubelet[2767]: E0209 20:37:23.941634 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e" Feb 9 20:37:23.941695 kubelet[2767]: E0209 20:37:23.941699 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e} Feb 9 20:37:23.942215 kubelet[2767]: E0209 20:37:23.941788 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:37:23.942215 kubelet[2767]: E0209 20:37:23.941857 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-qgbqd" podUID=65f64fea-8fa3-417d-9fee-7bbdf36de2c6 Feb 9 20:37:24.884851 env[1563]: time="2024-02-09T20:37:24.884727388Z" level=info msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\"" Feb 9 20:37:24.945904 env[1563]: time="2024-02-09T20:37:24.945781705Z" level=error msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\" failed" error="failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:37:24.946826 kubelet[2767]: E0209 20:37:24.946293 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1" Feb 9 20:37:24.946826 kubelet[2767]: E0209 20:37:24.946403 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1} Feb 9 20:37:24.946826 kubelet[2767]: E0209 20:37:24.946507 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox 
\\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:37:24.946826 kubelet[2767]: E0209 20:37:24.946585 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-dx2ql" podUID=21aae8c4-8c7c-48d6-86a1-b78761bdb569 Feb 9 20:37:27.885382 env[1563]: time="2024-02-09T20:37:27.885279502Z" level=info msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\"" Feb 9 20:37:27.885382 env[1563]: time="2024-02-09T20:37:27.885304234Z" level=info msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\"" Feb 9 20:37:27.908124 env[1563]: time="2024-02-09T20:37:27.908083561Z" level=error msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\" failed" error="failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:37:27.908124 env[1563]: time="2024-02-09T20:37:27.908100194Z" level=error msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\" failed" error="failed to destroy network for sandbox 
\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:37:27.908285 kubelet[2767]: E0209 20:37:27.908272 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5" Feb 9 20:37:27.908285 kubelet[2767]: E0209 20:37:27.908279 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4" Feb 9 20:37:27.908502 kubelet[2767]: E0209 20:37:27.908307 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4} Feb 9 20:37:27.908502 kubelet[2767]: E0209 20:37:27.908307 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5} Feb 9 20:37:27.908502 kubelet[2767]: E0209 20:37:27.908330 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: 
code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:37:27.908502 kubelet[2767]: E0209 20:37:27.908330 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:37:27.908502 kubelet[2767]: E0209 20:37:27.908355 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-djg8j" podUID=c5bd3826-3b40-4247-a328-b65f90095c86 Feb 9 20:37:27.908668 kubelet[2767]: E0209 20:37:27.908357 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that 
the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-68576cfd85-6rqtz" podUID=0d4f3e13-ea2f-4678-87aa-ee971f79a1cb Feb 9 20:37:36.885373 env[1563]: time="2024-02-09T20:37:36.885230868Z" level=info msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\"" Feb 9 20:37:36.939397 env[1563]: time="2024-02-09T20:37:36.939327760Z" level=error msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\" failed" error="failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:37:36.939619 kubelet[2767]: E0209 20:37:36.939595 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e" Feb 9 20:37:36.939994 kubelet[2767]: E0209 20:37:36.939639 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e} Feb 9 20:37:36.939994 kubelet[2767]: E0209 20:37:36.939684 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed 
(delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:37:36.939994 kubelet[2767]: E0209 20:37:36.939721 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-qgbqd" podUID=65f64fea-8fa3-417d-9fee-7bbdf36de2c6 Feb 9 20:37:38.885014 env[1563]: time="2024-02-09T20:37:38.884901802Z" level=info msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\"" Feb 9 20:37:38.911119 env[1563]: time="2024-02-09T20:37:38.911058615Z" level=error msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\" failed" error="failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:37:38.911268 kubelet[2767]: E0209 20:37:38.911256 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1" Feb 9 
20:37:38.911458 kubelet[2767]: E0209 20:37:38.911284 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1} Feb 9 20:37:38.911458 kubelet[2767]: E0209 20:37:38.911307 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:37:38.911458 kubelet[2767]: E0209 20:37:38.911324 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-dx2ql" podUID=21aae8c4-8c7c-48d6-86a1-b78761bdb569 Feb 9 20:37:39.885375 env[1563]: time="2024-02-09T20:37:39.885278442Z" level=info msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\"" Feb 9 20:37:39.911004 env[1563]: time="2024-02-09T20:37:39.910970586Z" level=error msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\" failed" error="failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:37:39.911218 kubelet[2767]: E0209 20:37:39.911182 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5" Feb 9 20:37:39.911218 kubelet[2767]: E0209 20:37:39.911208 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5} Feb 9 20:37:39.911275 kubelet[2767]: E0209 20:37:39.911228 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:37:39.911275 kubelet[2767]: E0209 20:37:39.911244 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-68576cfd85-6rqtz" 
podUID=0d4f3e13-ea2f-4678-87aa-ee971f79a1cb Feb 9 20:37:41.885565 env[1563]: time="2024-02-09T20:37:41.885470323Z" level=info msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\"" Feb 9 20:37:41.915458 env[1563]: time="2024-02-09T20:37:41.915344126Z" level=error msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\" failed" error="failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:37:41.915615 kubelet[2767]: E0209 20:37:41.915587 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4" Feb 9 20:37:41.915802 kubelet[2767]: E0209 20:37:41.915635 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4} Feb 9 20:37:41.915802 kubelet[2767]: E0209 20:37:41.915665 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/\"" Feb 9 20:37:41.915802 kubelet[2767]: E0209 20:37:41.915695 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-djg8j" podUID=c5bd3826-3b40-4247-a328-b65f90095c86 Feb 9 20:37:50.885054 env[1563]: time="2024-02-09T20:37:50.884949464Z" level=info msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\"" Feb 9 20:37:50.911043 env[1563]: time="2024-02-09T20:37:50.911003680Z" level=error msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\" failed" error="failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:37:50.911273 kubelet[2767]: E0209 20:37:50.911262 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e" Feb 9 20:37:50.911479 kubelet[2767]: E0209 20:37:50.911289 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd 
ID:d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e} Feb 9 20:37:50.911479 kubelet[2767]: E0209 20:37:50.911313 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:37:50.911479 kubelet[2767]: E0209 20:37:50.911333 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-qgbqd" podUID=65f64fea-8fa3-417d-9fee-7bbdf36de2c6 Feb 9 20:37:52.885594 env[1563]: time="2024-02-09T20:37:52.885474513Z" level=info msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\"" Feb 9 20:37:52.939792 env[1563]: time="2024-02-09T20:37:52.939686215Z" level=error msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\" failed" error="failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:37:52.940049 kubelet[2767]: E0209 20:37:52.939992 2767 
remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4" Feb 9 20:37:52.940049 kubelet[2767]: E0209 20:37:52.940045 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4} Feb 9 20:37:52.940590 kubelet[2767]: E0209 20:37:52.940100 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:37:52.940590 kubelet[2767]: E0209 20:37:52.940143 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-djg8j" podUID=c5bd3826-3b40-4247-a328-b65f90095c86 Feb 9 20:37:53.885727 env[1563]: time="2024-02-09T20:37:53.885587828Z" level=info msg="StopPodSandbox for 
\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\"" Feb 9 20:37:53.885727 env[1563]: time="2024-02-09T20:37:53.885649032Z" level=info msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\"" Feb 9 20:37:53.948453 env[1563]: time="2024-02-09T20:37:53.948298715Z" level=error msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\" failed" error="failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:37:53.948993 kubelet[2767]: E0209 20:37:53.948902 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1" Feb 9 20:37:53.948993 kubelet[2767]: E0209 20:37:53.948997 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1} Feb 9 20:37:53.949953 kubelet[2767]: E0209 20:37:53.949103 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is 
running and has mounted /var/lib/calico/\"" Feb 9 20:37:53.949953 kubelet[2767]: E0209 20:37:53.949186 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-dx2ql" podUID=21aae8c4-8c7c-48d6-86a1-b78761bdb569 Feb 9 20:37:53.951240 env[1563]: time="2024-02-09T20:37:53.951119289Z" level=error msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\" failed" error="failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:37:53.951606 kubelet[2767]: E0209 20:37:53.951552 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5" Feb 9 20:37:53.951867 kubelet[2767]: E0209 20:37:53.951651 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5} Feb 9 20:37:53.951867 kubelet[2767]: E0209 20:37:53.951815 2767 
kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:37:53.952285 kubelet[2767]: E0209 20:37:53.951955 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-68576cfd85-6rqtz" podUID=0d4f3e13-ea2f-4678-87aa-ee971f79a1cb Feb 9 20:38:01.885233 env[1563]: time="2024-02-09T20:38:01.885110149Z" level=info msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\"" Feb 9 20:38:01.936903 env[1563]: time="2024-02-09T20:38:01.936792114Z" level=error msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\" failed" error="failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:38:01.937072 kubelet[2767]: E0209 20:38:01.937050 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy 
network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e" Feb 9 20:38:01.937448 kubelet[2767]: E0209 20:38:01.937093 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e} Feb 9 20:38:01.937448 kubelet[2767]: E0209 20:38:01.937136 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:38:01.937448 kubelet[2767]: E0209 20:38:01.937171 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-qgbqd" podUID=65f64fea-8fa3-417d-9fee-7bbdf36de2c6 Feb 9 20:38:04.884926 env[1563]: time="2024-02-09T20:38:04.884831457Z" level=info msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\"" Feb 9 20:38:04.928948 env[1563]: 
time="2024-02-09T20:38:04.928879169Z" level=error msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\" failed" error="failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:38:04.929083 kubelet[2767]: E0209 20:38:04.929069 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5" Feb 9 20:38:04.929291 kubelet[2767]: E0209 20:38:04.929101 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5} Feb 9 20:38:04.929291 kubelet[2767]: E0209 20:38:04.929134 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:38:04.929291 kubelet[2767]: E0209 20:38:04.929163 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = 
Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-68576cfd85-6rqtz" podUID=0d4f3e13-ea2f-4678-87aa-ee971f79a1cb Feb 9 20:38:06.885562 env[1563]: time="2024-02-09T20:38:06.885463493Z" level=info msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\"" Feb 9 20:38:06.912739 env[1563]: time="2024-02-09T20:38:06.912676908Z" level=error msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\" failed" error="failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:38:06.912930 kubelet[2767]: E0209 20:38:06.912903 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1" Feb 9 20:38:06.913089 kubelet[2767]: E0209 20:38:06.912940 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1} Feb 9 20:38:06.913089 kubelet[2767]: E0209 20:38:06.912960 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for 
\"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:38:06.913089 kubelet[2767]: E0209 20:38:06.912976 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-dx2ql" podUID=21aae8c4-8c7c-48d6-86a1-b78761bdb569 Feb 9 20:38:07.885713 env[1563]: time="2024-02-09T20:38:07.885582875Z" level=info msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\"" Feb 9 20:38:07.938402 env[1563]: time="2024-02-09T20:38:07.938296916Z" level=error msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\" failed" error="failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:38:07.938628 kubelet[2767]: E0209 20:38:07.938577 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" 
failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4" Feb 9 20:38:07.938628 kubelet[2767]: E0209 20:38:07.938623 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4} Feb 9 20:38:07.939051 kubelet[2767]: E0209 20:38:07.938667 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:38:07.939051 kubelet[2767]: E0209 20:38:07.938704 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-djg8j" podUID=c5bd3826-3b40-4247-a328-b65f90095c86 Feb 9 20:38:16.885891 env[1563]: time="2024-02-09T20:38:16.885703685Z" level=info msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\"" Feb 9 20:38:16.912790 env[1563]: time="2024-02-09T20:38:16.912713921Z" level=error msg="StopPodSandbox for 
\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\" failed" error="failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:38:16.912882 kubelet[2767]: E0209 20:38:16.912872 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e" Feb 9 20:38:16.913064 kubelet[2767]: E0209 20:38:16.912895 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e} Feb 9 20:38:16.913064 kubelet[2767]: E0209 20:38:16.912918 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:38:16.913064 kubelet[2767]: E0209 20:38:16.912934 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox 
\\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-qgbqd" podUID=65f64fea-8fa3-417d-9fee-7bbdf36de2c6 Feb 9 20:38:18.885371 env[1563]: time="2024-02-09T20:38:18.885237869Z" level=info msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\"" Feb 9 20:38:18.915977 env[1563]: time="2024-02-09T20:38:18.915924929Z" level=error msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\" failed" error="failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:38:18.916179 kubelet[2767]: E0209 20:38:18.916166 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5" Feb 9 20:38:18.916355 kubelet[2767]: E0209 20:38:18.916195 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5} Feb 9 20:38:18.916355 kubelet[2767]: E0209 20:38:18.916218 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: 
code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:38:18.916355 kubelet[2767]: E0209 20:38:18.916236 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-68576cfd85-6rqtz" podUID=0d4f3e13-ea2f-4678-87aa-ee971f79a1cb Feb 9 20:38:19.885169 env[1563]: time="2024-02-09T20:38:19.885065199Z" level=info msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\"" Feb 9 20:38:19.885169 env[1563]: time="2024-02-09T20:38:19.885075707Z" level=info msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\"" Feb 9 20:38:19.912032 env[1563]: time="2024-02-09T20:38:19.911995351Z" level=error msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\" failed" error="failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:38:19.912149 env[1563]: time="2024-02-09T20:38:19.912056553Z" level=error msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\" failed" 
error="failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:38:19.912212 kubelet[2767]: E0209 20:38:19.912198 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4" Feb 9 20:38:19.912248 kubelet[2767]: E0209 20:38:19.912207 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1" Feb 9 20:38:19.912248 kubelet[2767]: E0209 20:38:19.912228 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4} Feb 9 20:38:19.912248 kubelet[2767]: E0209 20:38:19.912228 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1} Feb 9 20:38:19.912312 kubelet[2767]: E0209 20:38:19.912251 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for 
\"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:38:19.912312 kubelet[2767]: E0209 20:38:19.912252 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:38:19.912312 kubelet[2767]: E0209 20:38:19.912268 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-dx2ql" podUID=21aae8c4-8c7c-48d6-86a1-b78761bdb569 Feb 9 20:38:19.912450 kubelet[2767]: E0209 20:38:19.912268 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed 
(delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-djg8j" podUID=c5bd3826-3b40-4247-a328-b65f90095c86 Feb 9 20:38:30.885733 env[1563]: time="2024-02-09T20:38:30.885646849Z" level=info msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\"" Feb 9 20:38:30.911529 env[1563]: time="2024-02-09T20:38:30.911467516Z" level=error msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\" failed" error="failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:38:30.911693 kubelet[2767]: E0209 20:38:30.911645 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e" Feb 9 20:38:30.911693 kubelet[2767]: E0209 20:38:30.911673 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e} Feb 9 20:38:30.911693 kubelet[2767]: E0209 20:38:30.911695 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox 
\\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:38:30.911942 kubelet[2767]: E0209 20:38:30.911713 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-qgbqd" podUID=65f64fea-8fa3-417d-9fee-7bbdf36de2c6 Feb 9 20:38:31.885692 env[1563]: time="2024-02-09T20:38:31.885603445Z" level=info msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\"" Feb 9 20:38:31.886010 env[1563]: time="2024-02-09T20:38:31.885603588Z" level=info msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\"" Feb 9 20:38:31.913508 env[1563]: time="2024-02-09T20:38:31.913443474Z" level=error msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\" failed" error="failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:38:31.913828 kubelet[2767]: E0209 20:38:31.913791 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox 
\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4" Feb 9 20:38:31.913828 kubelet[2767]: E0209 20:38:31.913817 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4} Feb 9 20:38:31.914027 kubelet[2767]: E0209 20:38:31.913838 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:38:31.914027 kubelet[2767]: E0209 20:38:31.913855 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-djg8j" podUID=c5bd3826-3b40-4247-a328-b65f90095c86 Feb 9 20:38:31.914725 env[1563]: time="2024-02-09T20:38:31.914683410Z" level=error msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\" failed" error="failed to destroy network for sandbox 
\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:38:31.914855 kubelet[2767]: E0209 20:38:31.914820 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5" Feb 9 20:38:31.914855 kubelet[2767]: E0209 20:38:31.914837 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5} Feb 9 20:38:31.914855 kubelet[2767]: E0209 20:38:31.914854 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:38:31.914941 kubelet[2767]: E0209 20:38:31.914873 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-68576cfd85-6rqtz" podUID=0d4f3e13-ea2f-4678-87aa-ee971f79a1cb Feb 9 20:38:34.885140 env[1563]: time="2024-02-09T20:38:34.885018153Z" level=info msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\"" Feb 9 20:38:34.900020 env[1563]: time="2024-02-09T20:38:34.899943955Z" level=error msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\" failed" error="failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:38:34.900136 kubelet[2767]: E0209 20:38:34.900126 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1" Feb 9 20:38:34.900310 kubelet[2767]: E0209 20:38:34.900151 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1} Feb 9 20:38:34.900310 kubelet[2767]: E0209 20:38:34.900173 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox 
\\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:38:34.900310 kubelet[2767]: E0209 20:38:34.900190 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-dx2ql" podUID=21aae8c4-8c7c-48d6-86a1-b78761bdb569 Feb 9 20:38:43.884934 env[1563]: time="2024-02-09T20:38:43.884799481Z" level=info msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\"" Feb 9 20:38:43.886051 env[1563]: time="2024-02-09T20:38:43.884920952Z" level=info msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\"" Feb 9 20:38:43.912254 env[1563]: time="2024-02-09T20:38:43.912218484Z" level=error msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\" failed" error="failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:38:43.912427 env[1563]: time="2024-02-09T20:38:43.912232001Z" level=error msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\" failed" error="failed to destroy network for sandbox 
\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:38:43.912494 kubelet[2767]: E0209 20:38:43.912481 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5" Feb 9 20:38:43.912697 kubelet[2767]: E0209 20:38:43.912524 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5} Feb 9 20:38:43.912697 kubelet[2767]: E0209 20:38:43.912546 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:38:43.912697 kubelet[2767]: E0209 20:38:43.912481 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has 
mounted /var/lib/calico/" podSandboxID="88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4" Feb 9 20:38:43.912697 kubelet[2767]: E0209 20:38:43.912564 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-68576cfd85-6rqtz" podUID=0d4f3e13-ea2f-4678-87aa-ee971f79a1cb Feb 9 20:38:43.912697 kubelet[2767]: E0209 20:38:43.912565 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4} Feb 9 20:38:43.912831 kubelet[2767]: E0209 20:38:43.912585 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:38:43.912831 kubelet[2767]: E0209 20:38:43.912614 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-djg8j" podUID=c5bd3826-3b40-4247-a328-b65f90095c86 Feb 9 20:38:45.885768 env[1563]: time="2024-02-09T20:38:45.885614974Z" level=info msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\"" Feb 9 20:38:45.912296 env[1563]: time="2024-02-09T20:38:45.912219975Z" level=error msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\" failed" error="failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:38:45.912458 kubelet[2767]: E0209 20:38:45.912443 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e" Feb 9 20:38:45.912623 kubelet[2767]: E0209 20:38:45.912470 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e} Feb 9 20:38:45.912623 kubelet[2767]: E0209 20:38:45.912491 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed 
(delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:38:45.912623 kubelet[2767]: E0209 20:38:45.912507 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-qgbqd" podUID=65f64fea-8fa3-417d-9fee-7bbdf36de2c6 Feb 9 20:38:48.885773 env[1563]: time="2024-02-09T20:38:48.885648343Z" level=info msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\"" Feb 9 20:38:48.911740 env[1563]: time="2024-02-09T20:38:48.911682794Z" level=error msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\" failed" error="failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:38:48.912002 kubelet[2767]: E0209 20:38:48.911967 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1" Feb 9 
20:38:48.912198 kubelet[2767]: E0209 20:38:48.912008 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1} Feb 9 20:38:48.912198 kubelet[2767]: E0209 20:38:48.912031 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:38:48.912198 kubelet[2767]: E0209 20:38:48.912048 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-dx2ql" podUID=21aae8c4-8c7c-48d6-86a1-b78761bdb569 Feb 9 20:38:52.440471 systemd[1]: Started sshd@9-86.109.11.101:22-139.178.89.65:47182.service. Feb 9 20:38:52.439000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-86.109.11.101:22-139.178.89.65:47182 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 9 20:38:52.468606 kernel: kauditd_printk_skb: 8 callbacks suppressed Feb 9 20:38:52.468706 kernel: audit: type=1130 audit(1707511132.439:286): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-86.109.11.101:22-139.178.89.65:47182 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:38:52.587000 audit[7159]: USER_ACCT pid=7159 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:38:52.589357 sshd[7159]: Accepted publickey for core from 139.178.89.65 port 47182 ssh2: RSA SHA256:ya3CuIx5HRXQ7ikfrirbGy0PeU2mVoIERJKJ2pM2LHs Feb 9 20:38:52.591113 sshd[7159]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 9 20:38:52.593621 systemd-logind[1548]: New session 12 of user core. Feb 9 20:38:52.594040 systemd[1]: Started session-12.scope. 
Feb 9 20:38:52.589000 audit[7159]: CRED_ACQ pid=7159 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:38:52.772260 kernel: audit: type=1101 audit(1707511132.587:287): pid=7159 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:38:52.772311 kernel: audit: type=1103 audit(1707511132.589:288): pid=7159 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:38:52.772330 kernel: audit: type=1006 audit(1707511132.589:289): pid=7159 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=12 res=1 Feb 9 20:38:52.774706 sshd[7159]: pam_unix(sshd:session): session closed for user core Feb 9 20:38:52.775943 systemd[1]: sshd@9-86.109.11.101:22-139.178.89.65:47182.service: Deactivated successfully. Feb 9 20:38:52.776574 systemd[1]: session-12.scope: Deactivated successfully. Feb 9 20:38:52.776636 systemd-logind[1548]: Session 12 logged out. Waiting for processes to exit. Feb 9 20:38:52.777133 systemd-logind[1548]: Removed session 12. 
Feb 9 20:38:52.831158 kernel: audit: type=1300 audit(1707511132.589:289): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffa4c29ee0 a2=3 a3=0 items=0 ppid=1 pid=7159 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=12 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:38:52.589000 audit[7159]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffa4c29ee0 a2=3 a3=0 items=0 ppid=1 pid=7159 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=12 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:38:52.589000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 9 20:38:52.954354 kernel: audit: type=1327 audit(1707511132.589:289): proctitle=737368643A20636F7265205B707269765D Feb 9 20:38:52.954432 kernel: audit: type=1105 audit(1707511132.594:290): pid=7159 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:38:52.594000 audit[7159]: USER_START pid=7159 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:38:52.595000 audit[7162]: CRED_ACQ pid=7162 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:38:53.138024 kernel: audit: type=1103 audit(1707511132.595:291): pid=7162 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred 
grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:38:53.138069 kernel: audit: type=1106 audit(1707511132.773:292): pid=7159 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:38:52.773000 audit[7159]: USER_END pid=7159 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:38:52.773000 audit[7159]: CRED_DISP pid=7159 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:38:53.323308 kernel: audit: type=1104 audit(1707511132.773:293): pid=7159 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:38:52.774000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-86.109.11.101:22-139.178.89.65:47182 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 9 20:38:56.884574 env[1563]: time="2024-02-09T20:38:56.884488818Z" level=info msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\"" Feb 9 20:38:56.952629 env[1563]: time="2024-02-09T20:38:56.952578819Z" level=error msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\" failed" error="failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:38:56.952821 kubelet[2767]: E0209 20:38:56.952800 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4" Feb 9 20:38:56.953123 kubelet[2767]: E0209 20:38:56.952848 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4} Feb 9 20:38:56.953123 kubelet[2767]: E0209 20:38:56.952901 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 
20:38:56.953123 kubelet[2767]: E0209 20:38:56.952940 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-djg8j" podUID=c5bd3826-3b40-4247-a328-b65f90095c86 Feb 9 20:38:57.782428 systemd[1]: Started sshd@10-86.109.11.101:22-139.178.89.65:47190.service. Feb 9 20:38:57.781000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-86.109.11.101:22-139.178.89.65:47190 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:38:57.898526 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 9 20:38:57.898571 kernel: audit: type=1130 audit(1707511137.781:295): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-86.109.11.101:22-139.178.89.65:47190 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:38:57.944000 audit[7221]: USER_ACCT pid=7221 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:38:57.945915 sshd[7221]: Accepted publickey for core from 139.178.89.65 port 47190 ssh2: RSA SHA256:ya3CuIx5HRXQ7ikfrirbGy0PeU2mVoIERJKJ2pM2LHs Feb 9 20:38:57.948062 sshd[7221]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 9 20:38:57.953977 systemd-logind[1548]: New session 13 of user core. 
Feb 9 20:38:57.955205 systemd[1]: Started session-13.scope. Feb 9 20:38:57.946000 audit[7221]: CRED_ACQ pid=7221 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:38:58.041392 sshd[7221]: pam_unix(sshd:session): session closed for user core Feb 9 20:38:58.042806 systemd[1]: sshd@10-86.109.11.101:22-139.178.89.65:47190.service: Deactivated successfully. Feb 9 20:38:58.043337 systemd-logind[1548]: Session 13 logged out. Waiting for processes to exit. Feb 9 20:38:58.043380 systemd[1]: session-13.scope: Deactivated successfully. Feb 9 20:38:58.043956 systemd-logind[1548]: Removed session 13. Feb 9 20:38:58.127664 kernel: audit: type=1101 audit(1707511137.944:296): pid=7221 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:38:58.127702 kernel: audit: type=1103 audit(1707511137.946:297): pid=7221 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:38:58.127717 kernel: audit: type=1006 audit(1707511137.946:298): pid=7221 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=13 res=1 Feb 9 20:38:58.186104 kernel: audit: type=1300 audit(1707511137.946:298): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffdf29a1fd0 a2=3 a3=0 items=0 ppid=1 pid=7221 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=13 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:38:57.946000 audit[7221]: 
SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffdf29a1fd0 a2=3 a3=0 items=0 ppid=1 pid=7221 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=13 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:38:58.277971 kernel: audit: type=1327 audit(1707511137.946:298): proctitle=737368643A20636F7265205B707269765D Feb 9 20:38:57.946000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 9 20:38:58.308375 kernel: audit: type=1105 audit(1707511137.961:299): pid=7221 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:38:57.961000 audit[7221]: USER_START pid=7221 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:38:58.402499 kernel: audit: type=1103 audit(1707511137.962:300): pid=7224 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:38:57.962000 audit[7224]: CRED_ACQ pid=7224 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:38:58.491501 kernel: audit: type=1106 audit(1707511138.040:301): pid=7221 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close 
grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:38:58.040000 audit[7221]: USER_END pid=7221 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:38:58.586767 kernel: audit: type=1104 audit(1707511138.040:302): pid=7221 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:38:58.040000 audit[7221]: CRED_DISP pid=7221 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:38:58.041000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-86.109.11.101:22-139.178.89.65:47190 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 9 20:38:58.885262 env[1563]: time="2024-02-09T20:38:58.885062850Z" level=info msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\"" Feb 9 20:38:58.885262 env[1563]: time="2024-02-09T20:38:58.885084279Z" level=info msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\"" Feb 9 20:38:58.912266 env[1563]: time="2024-02-09T20:38:58.912225390Z" level=error msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\" failed" error="failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:38:58.912413 env[1563]: time="2024-02-09T20:38:58.912296965Z" level=error msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\" failed" error="failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:38:58.912501 kubelet[2767]: E0209 20:38:58.912440 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5" Feb 9 20:38:58.912501 kubelet[2767]: E0209 20:38:58.912444 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" 
err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e" Feb 9 20:38:58.912501 kubelet[2767]: E0209 20:38:58.912498 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5} Feb 9 20:38:58.912501 kubelet[2767]: E0209 20:38:58.912498 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e} Feb 9 20:38:58.912756 kubelet[2767]: E0209 20:38:58.912520 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:38:58.912756 kubelet[2767]: E0209 20:38:58.912521 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:38:58.912756 kubelet[2767]: 
E0209 20:38:58.912537 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-68576cfd85-6rqtz" podUID=0d4f3e13-ea2f-4678-87aa-ee971f79a1cb Feb 9 20:38:58.912860 kubelet[2767]: E0209 20:38:58.912537 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-qgbqd" podUID=65f64fea-8fa3-417d-9fee-7bbdf36de2c6 Feb 9 20:39:02.884672 env[1563]: time="2024-02-09T20:39:02.884619284Z" level=info msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\"" Feb 9 20:39:02.898778 env[1563]: time="2024-02-09T20:39:02.898717659Z" level=error msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\" failed" error="failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:39:02.898894 kubelet[2767]: E0209 20:39:02.898852 2767 
remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1" Feb 9 20:39:02.898894 kubelet[2767]: E0209 20:39:02.898875 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1} Feb 9 20:39:02.899147 kubelet[2767]: E0209 20:39:02.898904 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:39:02.899147 kubelet[2767]: E0209 20:39:02.898928 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-dx2ql" podUID=21aae8c4-8c7c-48d6-86a1-b78761bdb569 Feb 9 20:39:03.048763 systemd[1]: Started sshd@11-86.109.11.101:22-139.178.89.65:33310.service. 
Feb 9 20:39:03.047000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-86.109.11.101:22-139.178.89.65:33310 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:39:03.075860 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 9 20:39:03.075939 kernel: audit: type=1130 audit(1707511143.047:304): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-86.109.11.101:22-139.178.89.65:33310 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:39:03.194000 audit[7340]: USER_ACCT pid=7340 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:03.196284 sshd[7340]: Accepted publickey for core from 139.178.89.65 port 33310 ssh2: RSA SHA256:ya3CuIx5HRXQ7ikfrirbGy0PeU2mVoIERJKJ2pM2LHs Feb 9 20:39:03.198620 sshd[7340]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 9 20:39:03.201060 systemd-logind[1548]: New session 14 of user core. Feb 9 20:39:03.201495 systemd[1]: Started session-14.scope. 
Feb 9 20:39:03.197000 audit[7340]: CRED_ACQ pid=7340 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:03.377563 kernel: audit: type=1101 audit(1707511143.194:305): pid=7340 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:03.377603 kernel: audit: type=1103 audit(1707511143.197:306): pid=7340 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:03.377623 kernel: audit: type=1006 audit(1707511143.197:307): pid=7340 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=14 res=1 Feb 9 20:39:03.436033 kernel: audit: type=1300 audit(1707511143.197:307): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd2b9c12c0 a2=3 a3=0 items=0 ppid=1 pid=7340 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=14 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:39:03.197000 audit[7340]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd2b9c12c0 a2=3 a3=0 items=0 ppid=1 pid=7340 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=14 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:39:03.197000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 9 20:39:03.528199 sshd[7340]: pam_unix(sshd:session): session closed for user core Feb 9 20:39:03.528387 kernel: audit: type=1327 
audit(1707511143.197:307): proctitle=737368643A20636F7265205B707269765D Feb 9 20:39:03.529677 systemd[1]: sshd@11-86.109.11.101:22-139.178.89.65:33310.service: Deactivated successfully. Feb 9 20:39:03.530292 systemd-logind[1548]: Session 14 logged out. Waiting for processes to exit. Feb 9 20:39:03.530304 systemd[1]: session-14.scope: Deactivated successfully. Feb 9 20:39:03.530740 systemd-logind[1548]: Removed session 14. Feb 9 20:39:03.202000 audit[7340]: USER_START pid=7340 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:03.652526 kernel: audit: type=1105 audit(1707511143.202:308): pid=7340 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:03.652561 kernel: audit: type=1103 audit(1707511143.203:309): pid=7343 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:03.203000 audit[7343]: CRED_ACQ pid=7343 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:03.527000 audit[7340]: USER_END pid=7340 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 
addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:03.836756 kernel: audit: type=1106 audit(1707511143.527:310): pid=7340 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:03.836796 kernel: audit: type=1104 audit(1707511143.527:311): pid=7340 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:03.527000 audit[7340]: CRED_DISP pid=7340 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:03.528000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-86.109.11.101:22-139.178.89.65:33310 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 9 20:39:07.884747 env[1563]: time="2024-02-09T20:39:07.884610431Z" level=info msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\"" Feb 9 20:39:07.911780 env[1563]: time="2024-02-09T20:39:07.911723450Z" level=error msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\" failed" error="failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:39:07.911871 kubelet[2767]: E0209 20:39:07.911810 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4" Feb 9 20:39:07.911871 kubelet[2767]: E0209 20:39:07.911832 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4} Feb 9 20:39:07.911871 kubelet[2767]: E0209 20:39:07.911854 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 
20:39:07.911871 kubelet[2767]: E0209 20:39:07.911871 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-djg8j" podUID=c5bd3826-3b40-4247-a328-b65f90095c86 Feb 9 20:39:08.476367 systemd[1]: Started sshd@12-86.109.11.101:22-139.178.89.65:55998.service. Feb 9 20:39:08.476000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-86.109.11.101:22-139.178.89.65:55998 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:39:08.503587 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 9 20:39:08.503681 kernel: audit: type=1130 audit(1707511148.476:313): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-86.109.11.101:22-139.178.89.65:55998 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:39:08.622000 audit[7400]: USER_ACCT pid=7400 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:08.623296 sshd[7400]: Accepted publickey for core from 139.178.89.65 port 55998 ssh2: RSA SHA256:ya3CuIx5HRXQ7ikfrirbGy0PeU2mVoIERJKJ2pM2LHs Feb 9 20:39:08.624611 sshd[7400]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 9 20:39:08.626919 systemd-logind[1548]: New session 15 of user core. 
Feb 9 20:39:08.627360 systemd[1]: Started session-15.scope. Feb 9 20:39:08.707833 sshd[7400]: pam_unix(sshd:session): session closed for user core Feb 9 20:39:08.709233 systemd[1]: sshd@12-86.109.11.101:22-139.178.89.65:55998.service: Deactivated successfully. Feb 9 20:39:08.709907 systemd[1]: session-15.scope: Deactivated successfully. Feb 9 20:39:08.709927 systemd-logind[1548]: Session 15 logged out. Waiting for processes to exit. Feb 9 20:39:08.710386 systemd-logind[1548]: Removed session 15. Feb 9 20:39:08.624000 audit[7400]: CRED_ACQ pid=7400 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:08.806692 kernel: audit: type=1101 audit(1707511148.622:314): pid=7400 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:08.806727 kernel: audit: type=1103 audit(1707511148.624:315): pid=7400 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:08.806743 kernel: audit: type=1006 audit(1707511148.624:316): pid=7400 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=15 res=1 Feb 9 20:39:08.865185 kernel: audit: type=1300 audit(1707511148.624:316): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffdee7eab70 a2=3 a3=0 items=0 ppid=1 pid=7400 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=15 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:39:08.624000 audit[7400]: 
SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffdee7eab70 a2=3 a3=0 items=0 ppid=1 pid=7400 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=15 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:39:08.624000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 9 20:39:08.987540 kernel: audit: type=1327 audit(1707511148.624:316): proctitle=737368643A20636F7265205B707269765D Feb 9 20:39:08.987568 kernel: audit: type=1105 audit(1707511148.629:317): pid=7400 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:08.629000 audit[7400]: USER_START pid=7400 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:09.081718 kernel: audit: type=1103 audit(1707511148.629:318): pid=7403 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:08.629000 audit[7403]: CRED_ACQ pid=7403 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:08.708000 audit[7400]: USER_END pid=7400 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" 
exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:09.266006 kernel: audit: type=1106 audit(1707511148.708:319): pid=7400 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:09.266044 kernel: audit: type=1104 audit(1707511148.708:320): pid=7400 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:08.708000 audit[7400]: CRED_DISP pid=7400 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:08.709000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-86.109.11.101:22-139.178.89.65:55998 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:39:13.710140 systemd[1]: Started sshd@13-86.109.11.101:22-139.178.89.65:56014.service. Feb 9 20:39:13.709000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-86.109.11.101:22-139.178.89.65:56014 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:39:13.737007 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 9 20:39:13.737054 kernel: audit: type=1130 audit(1707511153.709:322): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-86.109.11.101:22-139.178.89.65:56014 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? 
addr=? terminal=? res=success' Feb 9 20:39:13.856000 audit[7429]: USER_ACCT pid=7429 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:13.857379 sshd[7429]: Accepted publickey for core from 139.178.89.65 port 56014 ssh2: RSA SHA256:ya3CuIx5HRXQ7ikfrirbGy0PeU2mVoIERJKJ2pM2LHs Feb 9 20:39:13.858618 sshd[7429]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 9 20:39:13.860932 systemd-logind[1548]: New session 16 of user core. Feb 9 20:39:13.861425 systemd[1]: Started session-16.scope. Feb 9 20:39:13.884416 env[1563]: time="2024-02-09T20:39:13.884369152Z" level=info msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\"" Feb 9 20:39:13.900113 env[1563]: time="2024-02-09T20:39:13.900052204Z" level=error msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\" failed" error="failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:39:13.900250 kubelet[2767]: E0209 20:39:13.900191 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5" Feb 9 20:39:13.900250 kubelet[2767]: E0209 20:39:13.900223 2767 
kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5} Feb 9 20:39:13.900468 kubelet[2767]: E0209 20:39:13.900252 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:39:13.900468 kubelet[2767]: E0209 20:39:13.900278 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-68576cfd85-6rqtz" podUID=0d4f3e13-ea2f-4678-87aa-ee971f79a1cb Feb 9 20:39:13.939361 sshd[7429]: pam_unix(sshd:session): session closed for user core Feb 9 20:39:13.940709 systemd[1]: sshd@13-86.109.11.101:22-139.178.89.65:56014.service: Deactivated successfully. Feb 9 20:39:13.941261 systemd-logind[1548]: Session 16 logged out. Waiting for processes to exit. Feb 9 20:39:13.941312 systemd[1]: session-16.scope: Deactivated successfully. Feb 9 20:39:13.941786 systemd-logind[1548]: Removed session 16. 
Feb 9 20:39:13.858000 audit[7429]: CRED_ACQ pid=7429 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:13.949407 kernel: audit: type=1101 audit(1707511153.856:323): pid=7429 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:13.949438 kernel: audit: type=1103 audit(1707511153.858:324): pid=7429 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:14.097345 kernel: audit: type=1006 audit(1707511153.858:325): pid=7429 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=16 res=1 Feb 9 20:39:14.097376 kernel: audit: type=1300 audit(1707511153.858:325): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc051052d0 a2=3 a3=0 items=0 ppid=1 pid=7429 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=16 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:39:13.858000 audit[7429]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc051052d0 a2=3 a3=0 items=0 ppid=1 pid=7429 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=16 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:39:14.189311 kernel: audit: type=1327 audit(1707511153.858:325): proctitle=737368643A20636F7265205B707269765D Feb 9 20:39:13.858000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 9 20:39:14.219746 kernel: 
audit: type=1105 audit(1707511153.863:326): pid=7429 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:13.863000 audit[7429]: USER_START pid=7429 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:13.863000 audit[7432]: CRED_ACQ pid=7432 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:14.402829 kernel: audit: type=1103 audit(1707511153.863:327): pid=7432 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:14.402863 kernel: audit: type=1106 audit(1707511153.939:328): pid=7429 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:13.939000 audit[7429]: USER_END pid=7429 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:14.498107 kernel: audit: 
type=1104 audit(1707511153.939:329): pid=7429 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:13.939000 audit[7429]: CRED_DISP pid=7429 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:13.940000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-86.109.11.101:22-139.178.89.65:56014 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:39:14.885800 env[1563]: time="2024-02-09T20:39:14.885693830Z" level=info msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\"" Feb 9 20:39:14.885800 env[1563]: time="2024-02-09T20:39:14.885726878Z" level=info msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\"" Feb 9 20:39:14.912805 env[1563]: time="2024-02-09T20:39:14.912700769Z" level=error msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\" failed" error="failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:39:14.912805 env[1563]: time="2024-02-09T20:39:14.912776416Z" level=error msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\" failed" error="failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:39:14.912994 kubelet[2767]: E0209 20:39:14.912953 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1" Feb 9 20:39:14.912994 kubelet[2767]: E0209 20:39:14.912963 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e" Feb 9 20:39:14.913177 kubelet[2767]: E0209 20:39:14.912996 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e} Feb 9 20:39:14.913177 kubelet[2767]: E0209 20:39:14.912997 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1} Feb 9 20:39:14.913177 kubelet[2767]: E0209 20:39:14.913017 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox 
\\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:39:14.913177 kubelet[2767]: E0209 20:39:14.913018 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:39:14.913177 kubelet[2767]: E0209 20:39:14.913035 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-dx2ql" podUID=21aae8c4-8c7c-48d6-86a1-b78761bdb569 Feb 9 20:39:14.913314 kubelet[2767]: E0209 20:39:14.913035 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-qgbqd" podUID=65f64fea-8fa3-417d-9fee-7bbdf36de2c6 Feb 9 20:39:18.946310 systemd[1]: Started sshd@14-86.109.11.101:22-139.178.89.65:36870.service. Feb 9 20:39:18.945000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-86.109.11.101:22-139.178.89.65:36870 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:39:18.973489 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 9 20:39:18.973537 kernel: audit: type=1130 audit(1707511158.945:331): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-86.109.11.101:22-139.178.89.65:36870 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:39:19.092000 audit[7546]: USER_ACCT pid=7546 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:19.092814 sshd[7546]: Accepted publickey for core from 139.178.89.65 port 36870 ssh2: RSA SHA256:ya3CuIx5HRXQ7ikfrirbGy0PeU2mVoIERJKJ2pM2LHs Feb 9 20:39:19.093614 sshd[7546]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 9 20:39:19.095995 systemd-logind[1548]: New session 17 of user core. Feb 9 20:39:19.096564 systemd[1]: Started session-17.scope. 
Feb 9 20:39:19.092000 audit[7546]: CRED_ACQ pid=7546 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:19.276474 kernel: audit: type=1101 audit(1707511159.092:332): pid=7546 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:19.276516 kernel: audit: type=1103 audit(1707511159.092:333): pid=7546 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:19.276532 kernel: audit: type=1006 audit(1707511159.092:334): pid=7546 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=17 res=1 Feb 9 20:39:19.334975 kernel: audit: type=1300 audit(1707511159.092:334): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffcc06f2b60 a2=3 a3=0 items=0 ppid=1 pid=7546 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=17 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:39:19.092000 audit[7546]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffcc06f2b60 a2=3 a3=0 items=0 ppid=1 pid=7546 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=17 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:39:19.368351 sshd[7546]: pam_unix(sshd:session): session closed for user core Feb 9 20:39:19.369842 systemd[1]: sshd@14-86.109.11.101:22-139.178.89.65:36870.service: Deactivated successfully. 
Feb 9 20:39:19.370504 systemd[1]: session-17.scope: Deactivated successfully. Feb 9 20:39:19.370516 systemd-logind[1548]: Session 17 logged out. Waiting for processes to exit. Feb 9 20:39:19.371088 systemd-logind[1548]: Removed session 17. Feb 9 20:39:19.092000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 9 20:39:19.457391 kernel: audit: type=1327 audit(1707511159.092:334): proctitle=737368643A20636F7265205B707269765D Feb 9 20:39:19.457425 kernel: audit: type=1105 audit(1707511159.097:335): pid=7546 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:19.097000 audit[7546]: USER_START pid=7546 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:19.551567 kernel: audit: type=1103 audit(1707511159.098:336): pid=7549 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:19.098000 audit[7549]: CRED_ACQ pid=7549 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:19.640541 kernel: audit: type=1106 audit(1707511159.367:337): pid=7546 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail 
acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:19.367000 audit[7546]: USER_END pid=7546 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:19.367000 audit[7546]: CRED_DISP pid=7546 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:19.824941 kernel: audit: type=1104 audit(1707511159.367:338): pid=7546 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:19.368000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-86.109.11.101:22-139.178.89.65:36870 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 9 20:39:19.884144 env[1563]: time="2024-02-09T20:39:19.884081667Z" level=info msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\"" Feb 9 20:39:19.898182 env[1563]: time="2024-02-09T20:39:19.898108840Z" level=error msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\" failed" error="failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:39:19.898330 kubelet[2767]: E0209 20:39:19.898317 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4" Feb 9 20:39:19.898557 kubelet[2767]: E0209 20:39:19.898353 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4} Feb 9 20:39:19.898557 kubelet[2767]: E0209 20:39:19.898381 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 
20:39:19.898557 kubelet[2767]: E0209 20:39:19.898401 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-djg8j" podUID=c5bd3826-3b40-4247-a328-b65f90095c86 Feb 9 20:39:24.371171 systemd[1]: Started sshd@15-86.109.11.101:22-139.178.89.65:36874.service. Feb 9 20:39:24.369000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-86.109.11.101:22-139.178.89.65:36874 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:39:24.398688 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 9 20:39:24.398890 kernel: audit: type=1130 audit(1707511164.369:340): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-86.109.11.101:22-139.178.89.65:36874 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:39:24.515000 audit[7602]: USER_ACCT pid=7602 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:24.516847 sshd[7602]: Accepted publickey for core from 139.178.89.65 port 36874 ssh2: RSA SHA256:ya3CuIx5HRXQ7ikfrirbGy0PeU2mVoIERJKJ2pM2LHs Feb 9 20:39:24.517615 sshd[7602]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 9 20:39:24.519873 systemd-logind[1548]: New session 18 of user core. 
Feb 9 20:39:24.520367 systemd[1]: Started session-18.scope. Feb 9 20:39:24.516000 audit[7602]: CRED_ACQ pid=7602 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:24.698170 kernel: audit: type=1101 audit(1707511164.515:341): pid=7602 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:24.698259 kernel: audit: type=1103 audit(1707511164.516:342): pid=7602 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:24.698286 kernel: audit: type=1006 audit(1707511164.516:343): pid=7602 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=18 res=1 Feb 9 20:39:24.700973 sshd[7602]: pam_unix(sshd:session): session closed for user core Feb 9 20:39:24.702930 systemd[1]: Started sshd@16-86.109.11.101:22-139.178.89.65:36886.service. Feb 9 20:39:24.703351 systemd[1]: sshd@15-86.109.11.101:22-139.178.89.65:36874.service: Deactivated successfully. Feb 9 20:39:24.703913 systemd-logind[1548]: Session 18 logged out. Waiting for processes to exit. Feb 9 20:39:24.703943 systemd[1]: session-18.scope: Deactivated successfully. Feb 9 20:39:24.704491 systemd-logind[1548]: Removed session 18. 
Feb 9 20:39:24.756730 kernel: audit: type=1300 audit(1707511164.516:343): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe3330c7a0 a2=3 a3=0 items=0 ppid=1 pid=7602 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=18 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:39:24.516000 audit[7602]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe3330c7a0 a2=3 a3=0 items=0 ppid=1 pid=7602 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=18 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:39:24.848630 kernel: audit: type=1327 audit(1707511164.516:343): proctitle=737368643A20636F7265205B707269765D Feb 9 20:39:24.516000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 9 20:39:24.878706 sshd[7629]: Accepted publickey for core from 139.178.89.65 port 36886 ssh2: RSA SHA256:ya3CuIx5HRXQ7ikfrirbGy0PeU2mVoIERJKJ2pM2LHs Feb 9 20:39:24.879032 kernel: audit: type=1105 audit(1707511164.521:344): pid=7602 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:24.521000 audit[7602]: USER_START pid=7602 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:24.881620 sshd[7629]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 9 20:39:24.883816 systemd-logind[1548]: New session 19 of user core. Feb 9 20:39:24.884258 systemd[1]: Started session-19.scope. 
Feb 9 20:39:24.884502 env[1563]: time="2024-02-09T20:39:24.884479209Z" level=info msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\"" Feb 9 20:39:24.899399 env[1563]: time="2024-02-09T20:39:24.899359065Z" level=error msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\" failed" error="failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:39:24.899543 kubelet[2767]: E0209 20:39:24.899507 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5" Feb 9 20:39:24.899543 kubelet[2767]: E0209 20:39:24.899533 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5} Feb 9 20:39:24.899749 kubelet[2767]: E0209 20:39:24.899555 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:39:24.899749 
kubelet[2767]: E0209 20:39:24.899573 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-68576cfd85-6rqtz" podUID=0d4f3e13-ea2f-4678-87aa-ee971f79a1cb Feb 9 20:39:24.973230 kernel: audit: type=1103 audit(1707511164.521:345): pid=7605 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:24.521000 audit[7605]: CRED_ACQ pid=7605 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:25.056471 sshd[7629]: pam_unix(sshd:session): session closed for user core Feb 9 20:39:25.058213 systemd[1]: Started sshd@17-86.109.11.101:22-139.178.89.65:36896.service. Feb 9 20:39:25.058631 systemd[1]: sshd@16-86.109.11.101:22-139.178.89.65:36886.service: Deactivated successfully. Feb 9 20:39:25.059271 systemd[1]: session-19.scope: Deactivated successfully. Feb 9 20:39:25.059273 systemd-logind[1548]: Session 19 logged out. Waiting for processes to exit. Feb 9 20:39:25.060189 systemd-logind[1548]: Removed session 19. 
Feb 9 20:39:24.700000 audit[7602]: USER_END pid=7602 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:25.157463 kernel: audit: type=1106 audit(1707511164.700:346): pid=7602 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:25.157536 kernel: audit: type=1104 audit(1707511164.700:347): pid=7602 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:24.700000 audit[7602]: CRED_DISP pid=7602 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:24.701000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-86.109.11.101:22-139.178.89.65:36886 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:39:24.702000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-86.109.11.101:22-139.178.89.65:36874 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 9 20:39:24.877000 audit[7629]: USER_ACCT pid=7629 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:24.880000 audit[7629]: CRED_ACQ pid=7629 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:24.880000 audit[7629]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd282e9650 a2=3 a3=0 items=0 ppid=1 pid=7629 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=19 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:39:24.880000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 9 20:39:24.885000 audit[7629]: USER_START pid=7629 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:24.886000 audit[7649]: CRED_ACQ pid=7649 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:25.055000 audit[7629]: USER_END pid=7629 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:25.055000 audit[7629]: CRED_DISP pid=7629 uid=0 
auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:25.057000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-86.109.11.101:22-139.178.89.65:36896 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:39:25.057000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-86.109.11.101:22-139.178.89.65:36886 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:39:25.275000 audit[7680]: USER_ACCT pid=7680 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:25.277509 sshd[7680]: Accepted publickey for core from 139.178.89.65 port 36896 ssh2: RSA SHA256:ya3CuIx5HRXQ7ikfrirbGy0PeU2mVoIERJKJ2pM2LHs Feb 9 20:39:25.278000 audit[7680]: CRED_ACQ pid=7680 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:25.278000 audit[7680]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffcca443890 a2=3 a3=0 items=0 ppid=1 pid=7680 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=20 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:39:25.278000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 9 20:39:25.280297 sshd[7680]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 9 
20:39:25.289964 systemd-logind[1548]: New session 20 of user core. Feb 9 20:39:25.292311 systemd[1]: Started session-20.scope. Feb 9 20:39:25.306000 audit[7680]: USER_START pid=7680 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:25.309000 audit[7687]: CRED_ACQ pid=7687 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:25.437651 sshd[7680]: pam_unix(sshd:session): session closed for user core Feb 9 20:39:25.437000 audit[7680]: USER_END pid=7680 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:25.437000 audit[7680]: CRED_DISP pid=7680 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:25.440017 systemd[1]: sshd@17-86.109.11.101:22-139.178.89.65:36896.service: Deactivated successfully. Feb 9 20:39:25.438000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-86.109.11.101:22-139.178.89.65:36896 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:39:25.440766 systemd-logind[1548]: Session 20 logged out. Waiting for processes to exit. Feb 9 20:39:25.440815 systemd[1]: session-20.scope: Deactivated successfully. 
Feb 9 20:39:25.441326 systemd-logind[1548]: Removed session 20. Feb 9 20:39:26.885252 env[1563]: time="2024-02-09T20:39:26.885175178Z" level=info msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\"" Feb 9 20:39:26.934227 env[1563]: time="2024-02-09T20:39:26.934165545Z" level=error msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\" failed" error="failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:39:26.934424 kubelet[2767]: E0209 20:39:26.934402 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1" Feb 9 20:39:26.934753 kubelet[2767]: E0209 20:39:26.934446 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1} Feb 9 20:39:26.934753 kubelet[2767]: E0209 20:39:26.934489 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running 
and has mounted /var/lib/calico/\"" Feb 9 20:39:26.934753 kubelet[2767]: E0209 20:39:26.934522 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-dx2ql" podUID=21aae8c4-8c7c-48d6-86a1-b78761bdb569 Feb 9 20:39:29.885400 env[1563]: time="2024-02-09T20:39:29.885284727Z" level=info msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\"" Feb 9 20:39:29.911765 env[1563]: time="2024-02-09T20:39:29.911729815Z" level=error msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\" failed" error="failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:39:29.911901 kubelet[2767]: E0209 20:39:29.911888 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e" Feb 9 20:39:29.912097 kubelet[2767]: E0209 20:39:29.911919 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" 
podSandboxID={Type:containerd ID:d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e} Feb 9 20:39:29.912097 kubelet[2767]: E0209 20:39:29.911950 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:39:29.912097 kubelet[2767]: E0209 20:39:29.911978 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-qgbqd" podUID=65f64fea-8fa3-417d-9fee-7bbdf36de2c6 Feb 9 20:39:30.444303 systemd[1]: Started sshd@18-86.109.11.101:22-139.178.89.65:56390.service. Feb 9 20:39:30.443000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-86.109.11.101:22-139.178.89.65:56390 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:39:30.471514 kernel: kauditd_printk_skb: 23 callbacks suppressed Feb 9 20:39:30.471567 kernel: audit: type=1130 audit(1707511170.443:367): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-86.109.11.101:22-139.178.89.65:56390 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? 
addr=? terminal=? res=success' Feb 9 20:39:30.590000 audit[7771]: USER_ACCT pid=7771 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:30.591553 sshd[7771]: Accepted publickey for core from 139.178.89.65 port 56390 ssh2: RSA SHA256:ya3CuIx5HRXQ7ikfrirbGy0PeU2mVoIERJKJ2pM2LHs Feb 9 20:39:30.592655 sshd[7771]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 9 20:39:30.595132 systemd-logind[1548]: New session 21 of user core. Feb 9 20:39:30.595665 systemd[1]: Started session-21.scope. Feb 9 20:39:30.673510 sshd[7771]: pam_unix(sshd:session): session closed for user core Feb 9 20:39:30.674947 systemd[1]: sshd@18-86.109.11.101:22-139.178.89.65:56390.service: Deactivated successfully. Feb 9 20:39:30.675613 systemd[1]: session-21.scope: Deactivated successfully. Feb 9 20:39:30.675651 systemd-logind[1548]: Session 21 logged out. Waiting for processes to exit. Feb 9 20:39:30.676412 systemd-logind[1548]: Removed session 21. 
Feb 9 20:39:30.591000 audit[7771]: CRED_ACQ pid=7771 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:30.773320 kernel: audit: type=1101 audit(1707511170.590:368): pid=7771 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:30.773360 kernel: audit: type=1103 audit(1707511170.591:369): pid=7771 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:30.773375 kernel: audit: type=1006 audit(1707511170.591:370): pid=7771 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=21 res=1 Feb 9 20:39:30.591000 audit[7771]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffdd5c91340 a2=3 a3=0 items=0 ppid=1 pid=7771 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=21 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:39:30.884500 env[1563]: time="2024-02-09T20:39:30.884482498Z" level=info msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\"" Feb 9 20:39:30.896155 env[1563]: time="2024-02-09T20:39:30.896123148Z" level=error msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\" failed" error="failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:39:30.896406 kubelet[2767]: E0209 20:39:30.896299 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4" Feb 9 20:39:30.896406 kubelet[2767]: E0209 20:39:30.896325 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4} Feb 9 20:39:30.896406 kubelet[2767]: E0209 20:39:30.896357 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:39:30.896406 kubelet[2767]: E0209 20:39:30.896377 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="kube-system/coredns-787d4945fb-djg8j" podUID=c5bd3826-3b40-4247-a328-b65f90095c86 Feb 9 20:39:30.923729 kernel: audit: type=1300 audit(1707511170.591:370): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffdd5c91340 a2=3 a3=0 items=0 ppid=1 pid=7771 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=21 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:39:30.923766 kernel: audit: type=1327 audit(1707511170.591:370): proctitle=737368643A20636F7265205B707269765D Feb 9 20:39:30.591000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 9 20:39:30.954170 kernel: audit: type=1105 audit(1707511170.596:371): pid=7771 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:30.596000 audit[7771]: USER_START pid=7771 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:30.597000 audit[7774]: CRED_ACQ pid=7774 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:31.137270 kernel: audit: type=1103 audit(1707511170.597:372): pid=7774 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:31.137305 kernel: audit: type=1106 audit(1707511170.672:373): pid=7771 uid=0 auid=500 
ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:30.672000 audit[7771]: USER_END pid=7771 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:31.232531 kernel: audit: type=1104 audit(1707511170.672:374): pid=7771 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:30.672000 audit[7771]: CRED_DISP pid=7771 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:30.673000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-86.109.11.101:22-139.178.89.65:56390 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:39:35.680538 systemd[1]: Started sshd@19-86.109.11.101:22-139.178.89.65:56394.service. Feb 9 20:39:35.680000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-86.109.11.101:22-139.178.89.65:56394 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 9 20:39:35.707490 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 9 20:39:35.707558 kernel: audit: type=1130 audit(1707511175.680:376): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-86.109.11.101:22-139.178.89.65:56394 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:39:35.827000 audit[7825]: USER_ACCT pid=7825 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:35.827726 sshd[7825]: Accepted publickey for core from 139.178.89.65 port 56394 ssh2: RSA SHA256:ya3CuIx5HRXQ7ikfrirbGy0PeU2mVoIERJKJ2pM2LHs Feb 9 20:39:35.829637 sshd[7825]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 9 20:39:35.832086 systemd-logind[1548]: New session 22 of user core. Feb 9 20:39:35.832511 systemd[1]: Started session-22.scope. Feb 9 20:39:35.912084 sshd[7825]: pam_unix(sshd:session): session closed for user core Feb 9 20:39:35.913552 systemd[1]: sshd@19-86.109.11.101:22-139.178.89.65:56394.service: Deactivated successfully. Feb 9 20:39:35.914162 systemd-logind[1548]: Session 22 logged out. Waiting for processes to exit. Feb 9 20:39:35.914177 systemd[1]: session-22.scope: Deactivated successfully. Feb 9 20:39:35.914993 systemd-logind[1548]: Removed session 22. 
Feb 9 20:39:35.829000 audit[7825]: CRED_ACQ pid=7825 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:36.008999 kernel: audit: type=1101 audit(1707511175.827:377): pid=7825 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:36.009033 kernel: audit: type=1103 audit(1707511175.829:378): pid=7825 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:36.009051 kernel: audit: type=1006 audit(1707511175.829:379): pid=7825 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=22 res=1 Feb 9 20:39:36.067423 kernel: audit: type=1300 audit(1707511175.829:379): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffcd19b54b0 a2=3 a3=0 items=0 ppid=1 pid=7825 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=22 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:39:35.829000 audit[7825]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffcd19b54b0 a2=3 a3=0 items=0 ppid=1 pid=7825 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=22 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:39:36.159298 kernel: audit: type=1327 audit(1707511175.829:379): proctitle=737368643A20636F7265205B707269765D Feb 9 20:39:35.829000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 9 20:39:36.189682 kernel: 
audit: type=1105 audit(1707511175.834:380): pid=7825 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:35.834000 audit[7825]: USER_START pid=7825 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:36.283839 kernel: audit: type=1103 audit(1707511175.835:381): pid=7828 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:35.835000 audit[7828]: CRED_ACQ pid=7828 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:36.372841 kernel: audit: type=1106 audit(1707511175.912:382): pid=7825 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:35.912000 audit[7825]: USER_END pid=7825 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:36.468083 kernel: audit: 
type=1104 audit(1707511175.912:383): pid=7825 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:35.912000 audit[7825]: CRED_DISP pid=7825 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:35.913000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-86.109.11.101:22-139.178.89.65:56394 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:39:36.884847 env[1563]: time="2024-02-09T20:39:36.884720658Z" level=info msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\"" Feb 9 20:39:36.919753 env[1563]: time="2024-02-09T20:39:36.919709300Z" level=error msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\" failed" error="failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:39:36.919995 kubelet[2767]: E0209 20:39:36.919947 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5" Feb 9 
20:39:36.919995 kubelet[2767]: E0209 20:39:36.919973 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5} Feb 9 20:39:36.919995 kubelet[2767]: E0209 20:39:36.919993 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:39:36.920219 kubelet[2767]: E0209 20:39:36.920008 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-68576cfd85-6rqtz" podUID=0d4f3e13-ea2f-4678-87aa-ee971f79a1cb Feb 9 20:39:40.885762 env[1563]: time="2024-02-09T20:39:40.885628998Z" level=info msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\"" Feb 9 20:39:40.920271 systemd[1]: Started sshd@20-86.109.11.101:22-139.178.89.65:49234.service. Feb 9 20:39:40.920000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-86.109.11.101:22-139.178.89.65:49234 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 9 20:39:40.953502 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 9 20:39:40.953574 kernel: audit: type=1130 audit(1707511180.920:385): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-86.109.11.101:22-139.178.89.65:49234 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:39:40.953704 env[1563]: time="2024-02-09T20:39:40.953675336Z" level=error msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\" failed" error="failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:39:40.953861 kubelet[2767]: E0209 20:39:40.953851 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e" Feb 9 20:39:40.954046 kubelet[2767]: E0209 20:39:40.953874 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e} Feb 9 20:39:40.954046 kubelet[2767]: E0209 20:39:40.953896 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin 
type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:39:40.954046 kubelet[2767]: E0209 20:39:40.953913 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-qgbqd" podUID=65f64fea-8fa3-417d-9fee-7bbdf36de2c6 Feb 9 20:39:41.071000 audit[7905]: USER_ACCT pid=7905 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:41.072117 sshd[7905]: Accepted publickey for core from 139.178.89.65 port 49234 ssh2: RSA SHA256:ya3CuIx5HRXQ7ikfrirbGy0PeU2mVoIERJKJ2pM2LHs Feb 9 20:39:41.073631 sshd[7905]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 9 20:39:41.076232 systemd-logind[1548]: New session 23 of user core. Feb 9 20:39:41.076678 systemd[1]: Started session-23.scope. Feb 9 20:39:41.154652 sshd[7905]: pam_unix(sshd:session): session closed for user core Feb 9 20:39:41.156041 systemd[1]: sshd@20-86.109.11.101:22-139.178.89.65:49234.service: Deactivated successfully. Feb 9 20:39:41.156711 systemd[1]: session-23.scope: Deactivated successfully. Feb 9 20:39:41.156747 systemd-logind[1548]: Session 23 logged out. Waiting for processes to exit. Feb 9 20:39:41.157244 systemd-logind[1548]: Removed session 23. 
Feb 9 20:39:41.073000 audit[7905]: CRED_ACQ pid=7905 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:41.253524 kernel: audit: type=1101 audit(1707511181.071:386): pid=7905 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:41.253562 kernel: audit: type=1103 audit(1707511181.073:387): pid=7905 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:41.253578 kernel: audit: type=1006 audit(1707511181.073:388): pid=7905 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=23 res=1 Feb 9 20:39:41.312024 kernel: audit: type=1300 audit(1707511181.073:388): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe8ade9270 a2=3 a3=0 items=0 ppid=1 pid=7905 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=23 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:39:41.073000 audit[7905]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe8ade9270 a2=3 a3=0 items=0 ppid=1 pid=7905 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=23 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:39:41.404069 kernel: audit: type=1327 audit(1707511181.073:388): proctitle=737368643A20636F7265205B707269765D Feb 9 20:39:41.073000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 9 20:39:41.434500 kernel: 
audit: type=1105 audit(1707511181.078:389): pid=7905 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:41.078000 audit[7905]: USER_START pid=7905 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:41.528740 kernel: audit: type=1103 audit(1707511181.079:390): pid=7913 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:41.079000 audit[7913]: CRED_ACQ pid=7913 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:41.154000 audit[7905]: USER_END pid=7905 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:41.713206 kernel: audit: type=1106 audit(1707511181.154:391): pid=7905 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:41.713256 kernel: audit: 
type=1104 audit(1707511181.155:392): pid=7905 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:41.155000 audit[7905]: CRED_DISP pid=7905 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:41.155000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-86.109.11.101:22-139.178.89.65:49234 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:39:41.884830 env[1563]: time="2024-02-09T20:39:41.884746649Z" level=info msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\"" Feb 9 20:39:41.885136 env[1563]: time="2024-02-09T20:39:41.884785914Z" level=info msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\"" Feb 9 20:39:41.909992 env[1563]: time="2024-02-09T20:39:41.909932638Z" level=error msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\" failed" error="failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:39:41.909992 env[1563]: time="2024-02-09T20:39:41.909952303Z" level=error msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\" failed" error="failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:39:41.910251 kubelet[2767]: E0209 20:39:41.910120 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1" Feb 9 20:39:41.910251 kubelet[2767]: E0209 20:39:41.910152 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1} Feb 9 20:39:41.910251 kubelet[2767]: E0209 20:39:41.910173 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:39:41.910251 kubelet[2767]: E0209 20:39:41.910192 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/\"" pod="calico-system/csi-node-driver-dx2ql" podUID=21aae8c4-8c7c-48d6-86a1-b78761bdb569 Feb 9 20:39:41.910384 kubelet[2767]: E0209 20:39:41.910120 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4" Feb 9 20:39:41.910384 kubelet[2767]: E0209 20:39:41.910210 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4} Feb 9 20:39:41.910384 kubelet[2767]: E0209 20:39:41.910229 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:39:41.910384 kubelet[2767]: E0209 20:39:41.910244 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="kube-system/coredns-787d4945fb-djg8j" podUID=c5bd3826-3b40-4247-a328-b65f90095c86 Feb 9 20:39:46.161003 systemd[1]: Started sshd@21-86.109.11.101:22-139.178.89.65:49240.service. Feb 9 20:39:46.159000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-86.109.11.101:22-139.178.89.65:49240 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:39:46.188157 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 9 20:39:46.188203 kernel: audit: type=1130 audit(1707511186.159:394): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-86.109.11.101:22-139.178.89.65:49240 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:39:46.308000 audit[7993]: USER_ACCT pid=7993 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:46.308791 sshd[7993]: Accepted publickey for core from 139.178.89.65 port 49240 ssh2: RSA SHA256:ya3CuIx5HRXQ7ikfrirbGy0PeU2mVoIERJKJ2pM2LHs Feb 9 20:39:46.309614 sshd[7993]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 9 20:39:46.311982 systemd-logind[1548]: New session 24 of user core. Feb 9 20:39:46.312418 systemd[1]: Started session-24.scope. Feb 9 20:39:46.391808 sshd[7993]: pam_unix(sshd:session): session closed for user core Feb 9 20:39:46.393194 systemd[1]: sshd@21-86.109.11.101:22-139.178.89.65:49240.service: Deactivated successfully. Feb 9 20:39:46.394048 systemd[1]: session-24.scope: Deactivated successfully. Feb 9 20:39:46.394078 systemd-logind[1548]: Session 24 logged out. Waiting for processes to exit. Feb 9 20:39:46.394730 systemd-logind[1548]: Removed session 24. 
Feb 9 20:39:46.308000 audit[7993]: CRED_ACQ pid=7993 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:46.492394 kernel: audit: type=1101 audit(1707511186.308:395): pid=7993 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:46.492429 kernel: audit: type=1103 audit(1707511186.308:396): pid=7993 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:46.492446 kernel: audit: type=1006 audit(1707511186.308:397): pid=7993 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=24 res=1 Feb 9 20:39:46.550881 kernel: audit: type=1300 audit(1707511186.308:397): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd6a62f6a0 a2=3 a3=0 items=0 ppid=1 pid=7993 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=24 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:39:46.308000 audit[7993]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd6a62f6a0 a2=3 a3=0 items=0 ppid=1 pid=7993 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=24 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:39:46.642839 kernel: audit: type=1327 audit(1707511186.308:397): proctitle=737368643A20636F7265205B707269765D Feb 9 20:39:46.308000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 9 20:39:46.673246 kernel: 
audit: type=1105 audit(1707511186.313:398): pid=7993 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:46.313000 audit[7993]: USER_START pid=7993 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:46.767420 kernel: audit: type=1103 audit(1707511186.314:399): pid=7998 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:46.314000 audit[7998]: CRED_ACQ pid=7998 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:46.856422 kernel: audit: type=1106 audit(1707511186.391:400): pid=7993 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:46.391000 audit[7993]: USER_END pid=7993 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:46.391000 audit[7993]: 
CRED_DISP pid=7993 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:47.040829 kernel: audit: type=1104 audit(1707511186.391:401): pid=7993 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:46.392000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-86.109.11.101:22-139.178.89.65:49240 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:39:47.885524 env[1563]: time="2024-02-09T20:39:47.885434561Z" level=info msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\"" Feb 9 20:39:47.912541 env[1563]: time="2024-02-09T20:39:47.912440419Z" level=error msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\" failed" error="failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:39:47.912800 kubelet[2767]: E0209 20:39:47.912749 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5" Feb 
9 20:39:47.912988 kubelet[2767]: E0209 20:39:47.912804 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5} Feb 9 20:39:47.912988 kubelet[2767]: E0209 20:39:47.912826 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:39:47.912988 kubelet[2767]: E0209 20:39:47.912843 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-68576cfd85-6rqtz" podUID=0d4f3e13-ea2f-4678-87aa-ee971f79a1cb Feb 9 20:39:51.400277 systemd[1]: Started sshd@22-86.109.11.101:22-139.178.89.65:40352.service. Feb 9 20:39:51.399000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-86.109.11.101:22-139.178.89.65:40352 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 9 20:39:51.427882 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 9 20:39:51.427964 kernel: audit: type=1130 audit(1707511191.399:403): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-86.109.11.101:22-139.178.89.65:40352 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:39:51.546000 audit[8049]: USER_ACCT pid=8049 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:51.547981 sshd[8049]: Accepted publickey for core from 139.178.89.65 port 40352 ssh2: RSA SHA256:ya3CuIx5HRXQ7ikfrirbGy0PeU2mVoIERJKJ2pM2LHs Feb 9 20:39:51.548665 sshd[8049]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 9 20:39:51.551065 systemd-logind[1548]: New session 25 of user core. Feb 9 20:39:51.551497 systemd[1]: Started session-25.scope. Feb 9 20:39:51.634006 sshd[8049]: pam_unix(sshd:session): session closed for user core Feb 9 20:39:51.635472 systemd[1]: sshd@22-86.109.11.101:22-139.178.89.65:40352.service: Deactivated successfully. Feb 9 20:39:51.636120 systemd-logind[1548]: Session 25 logged out. Waiting for processes to exit. Feb 9 20:39:51.636166 systemd[1]: session-25.scope: Deactivated successfully. Feb 9 20:39:51.636808 systemd-logind[1548]: Removed session 25. 
Feb 9 20:39:51.547000 audit[8049]: CRED_ACQ pid=8049 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:51.729353 kernel: audit: type=1101 audit(1707511191.546:404): pid=8049 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:51.729448 kernel: audit: type=1103 audit(1707511191.547:405): pid=8049 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:51.729465 kernel: audit: type=1006 audit(1707511191.547:406): pid=8049 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=25 res=1 Feb 9 20:39:51.547000 audit[8049]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff29445880 a2=3 a3=0 items=0 ppid=1 pid=8049 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=25 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:39:51.879773 kernel: audit: type=1300 audit(1707511191.547:406): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff29445880 a2=3 a3=0 items=0 ppid=1 pid=8049 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=25 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:39:51.879808 kernel: audit: type=1327 audit(1707511191.547:406): proctitle=737368643A20636F7265205B707269765D Feb 9 20:39:51.547000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 9 20:39:51.884324 
env[1563]: time="2024-02-09T20:39:51.884277884Z" level=info msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\"" Feb 9 20:39:51.896448 env[1563]: time="2024-02-09T20:39:51.896370192Z" level=error msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\" failed" error="failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:39:51.896620 kubelet[2767]: E0209 20:39:51.896574 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e" Feb 9 20:39:51.896620 kubelet[2767]: E0209 20:39:51.896600 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e} Feb 9 20:39:51.896829 kubelet[2767]: E0209 20:39:51.896623 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:39:51.896829 kubelet[2767]: E0209 
20:39:51.896642 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-qgbqd" podUID=65f64fea-8fa3-417d-9fee-7bbdf36de2c6 Feb 9 20:39:51.910200 kernel: audit: type=1105 audit(1707511191.552:407): pid=8049 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:51.552000 audit[8049]: USER_START pid=8049 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:52.004412 kernel: audit: type=1103 audit(1707511191.552:408): pid=8052 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:51.552000 audit[8052]: CRED_ACQ pid=8052 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:52.093318 kernel: audit: type=1106 audit(1707511191.633:409): pid=8049 uid=0 auid=500 ses=25 
subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:51.633000 audit[8049]: USER_END pid=8049 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:52.188561 kernel: audit: type=1104 audit(1707511191.633:410): pid=8049 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:51.633000 audit[8049]: CRED_DISP pid=8049 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:51.634000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-86.109.11.101:22-139.178.89.65:40352 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 9 20:39:55.885006 env[1563]: time="2024-02-09T20:39:55.884887309Z" level=info msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\"" Feb 9 20:39:55.886084 env[1563]: time="2024-02-09T20:39:55.885048834Z" level=info msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\"" Feb 9 20:39:55.935262 env[1563]: time="2024-02-09T20:39:55.935202376Z" level=error msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\" failed" error="failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:39:55.935461 kubelet[2767]: E0209 20:39:55.935439 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4" Feb 9 20:39:55.935846 kubelet[2767]: E0209 20:39:55.935490 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4} Feb 9 20:39:55.935846 kubelet[2767]: E0209 20:39:55.935534 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" 
failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:39:55.935846 kubelet[2767]: E0209 20:39:55.935575 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-djg8j" podUID=c5bd3826-3b40-4247-a328-b65f90095c86 Feb 9 20:39:55.935846 kubelet[2767]: E0209 20:39:55.935642 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1" Feb 9 20:39:55.935846 kubelet[2767]: E0209 20:39:55.935677 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1} Feb 9 20:39:55.936163 env[1563]: time="2024-02-09T20:39:55.935489383Z" level=error msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\" failed" error="failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running 
and has mounted /var/lib/calico/" Feb 9 20:39:55.936222 kubelet[2767]: E0209 20:39:55.935718 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:39:55.936222 kubelet[2767]: E0209 20:39:55.935749 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-dx2ql" podUID=21aae8c4-8c7c-48d6-86a1-b78761bdb569 Feb 9 20:39:56.642106 systemd[1]: Started sshd@23-86.109.11.101:22-139.178.89.65:40364.service. Feb 9 20:39:56.641000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-86.109.11.101:22-139.178.89.65:40364 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:39:56.671416 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 9 20:39:56.671515 kernel: audit: type=1130 audit(1707511196.641:412): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-86.109.11.101:22-139.178.89.65:40364 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success'
Feb 9 20:39:56.822000 audit[8159]: USER_ACCT pid=8159 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Feb 9 20:39:56.824474 sshd[8159]: Accepted publickey for core from 139.178.89.65 port 40364 ssh2: RSA SHA256:ya3CuIx5HRXQ7ikfrirbGy0PeU2mVoIERJKJ2pM2LHs
Feb 9 20:39:56.828527 sshd[8159]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Feb 9 20:39:56.838438 systemd-logind[1548]: New session 26 of user core.
Feb 9 20:39:56.840659 systemd[1]: Started session-26.scope.
Feb 9 20:39:56.826000 audit[8159]: CRED_ACQ pid=8159 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Feb 9 20:39:56.937400 sshd[8159]: pam_unix(sshd:session): session closed for user core
Feb 9 20:39:56.938825 systemd[1]: sshd@23-86.109.11.101:22-139.178.89.65:40364.service: Deactivated successfully.
Feb 9 20:39:56.939377 systemd-logind[1548]: Session 26 logged out. Waiting for processes to exit.
Feb 9 20:39:56.939411 systemd[1]: session-26.scope: Deactivated successfully.
Feb 9 20:39:56.939923 systemd-logind[1548]: Removed session 26.
Feb 9 20:39:57.006686 kernel: audit: type=1101 audit(1707511196.822:413): pid=8159 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:57.006781 kernel: audit: type=1103 audit(1707511196.826:414): pid=8159 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:57.006802 kernel: audit: type=1006 audit(1707511196.826:415): pid=8159 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=26 res=1 Feb 9 20:39:57.065192 kernel: audit: type=1300 audit(1707511196.826:415): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffec752500 a2=3 a3=0 items=0 ppid=1 pid=8159 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=26 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:39:56.826000 audit[8159]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffec752500 a2=3 a3=0 items=0 ppid=1 pid=8159 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=26 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:39:57.157110 kernel: audit: type=1327 audit(1707511196.826:415): proctitle=737368643A20636F7265205B707269765D Feb 9 20:39:56.826000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 9 20:39:57.187530 kernel: audit: type=1105 audit(1707511196.848:416): pid=8159 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" 
hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:56.848000 audit[8159]: USER_START pid=8159 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:57.281707 kernel: audit: type=1103 audit(1707511196.850:417): pid=8162 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:56.850000 audit[8162]: CRED_ACQ pid=8162 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:56.936000 audit[8159]: USER_END pid=8159 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:57.465927 kernel: audit: type=1106 audit(1707511196.936:418): pid=8159 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:57.465977 kernel: audit: type=1104 audit(1707511196.936:419): pid=8159 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 
20:39:56.936000 audit[8159]: CRED_DISP pid=8159 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:39:56.937000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-86.109.11.101:22-139.178.89.65:40364 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:40:01.884571 env[1563]: time="2024-02-09T20:40:01.884518597Z" level=info msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\"" Feb 9 20:40:01.898275 env[1563]: time="2024-02-09T20:40:01.898217656Z" level=error msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\" failed" error="failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:40:01.898403 kubelet[2767]: E0209 20:40:01.898381 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5" Feb 9 20:40:01.898582 kubelet[2767]: E0209 20:40:01.898411 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5} Feb 9 20:40:01.898582 kubelet[2767]: E0209 20:40:01.898433 
2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:40:01.898582 kubelet[2767]: E0209 20:40:01.898452 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-68576cfd85-6rqtz" podUID=0d4f3e13-ea2f-4678-87aa-ee971f79a1cb Feb 9 20:40:01.944804 systemd[1]: Started sshd@24-86.109.11.101:22-139.178.89.65:45632.service. Feb 9 20:40:01.944000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@24-86.109.11.101:22-139.178.89.65:45632 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:40:01.971407 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 9 20:40:01.971528 kernel: audit: type=1130 audit(1707511201.944:421): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@24-86.109.11.101:22-139.178.89.65:45632 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success'
Feb 9 20:40:02.091734 sshd[8211]: Accepted publickey for core from 139.178.89.65 port 45632 ssh2: RSA SHA256:ya3CuIx5HRXQ7ikfrirbGy0PeU2mVoIERJKJ2pM2LHs
Feb 9 20:40:02.091000 audit[8211]: USER_ACCT pid=8211 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Feb 9 20:40:02.093617 sshd[8211]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Feb 9 20:40:02.096031 systemd-logind[1548]: New session 27 of user core.
Feb 9 20:40:02.096577 systemd[1]: Started session-27.scope.
Feb 9 20:40:02.176849 sshd[8211]: pam_unix(sshd:session): session closed for user core
Feb 9 20:40:02.178473 systemd[1]: Started sshd@25-86.109.11.101:22-139.178.89.65:45634.service.
Feb 9 20:40:02.178764 systemd[1]: sshd@24-86.109.11.101:22-139.178.89.65:45632.service: Deactivated successfully.
Feb 9 20:40:02.179286 systemd-logind[1548]: Session 27 logged out. Waiting for processes to exit.
Feb 9 20:40:02.179322 systemd[1]: session-27.scope: Deactivated successfully.
Feb 9 20:40:02.179802 systemd-logind[1548]: Removed session 27.
Feb 9 20:40:02.093000 audit[8211]: CRED_ACQ pid=8211 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:02.273509 kernel: audit: type=1101 audit(1707511202.091:422): pid=8211 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:02.273552 kernel: audit: type=1103 audit(1707511202.093:423): pid=8211 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:02.274422 kernel: audit: type=1006 audit(1707511202.093:424): pid=8211 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=27 res=1 Feb 9 20:40:02.303467 sshd[8235]: Accepted publickey for core from 139.178.89.65 port 45634 ssh2: RSA SHA256:ya3CuIx5HRXQ7ikfrirbGy0PeU2mVoIERJKJ2pM2LHs Feb 9 20:40:02.304643 sshd[8235]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 9 20:40:02.307574 systemd-logind[1548]: New session 28 of user core. Feb 9 20:40:02.308867 systemd[1]: Started session-28.scope. 
Feb 9 20:40:02.093000 audit[8211]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc6cd8cbb0 a2=3 a3=0 items=0 ppid=1 pid=8211 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=27 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:40:02.424029 kernel: audit: type=1300 audit(1707511202.093:424): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc6cd8cbb0 a2=3 a3=0 items=0 ppid=1 pid=8211 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=27 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:40:02.424107 kernel: audit: type=1327 audit(1707511202.093:424): proctitle=737368643A20636F7265205B707269765D Feb 9 20:40:02.093000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 9 20:40:02.098000 audit[8211]: USER_START pid=8211 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:02.548604 kernel: audit: type=1105 audit(1707511202.098:425): pid=8211 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:02.548640 kernel: audit: type=1103 audit(1707511202.099:426): pid=8214 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:02.099000 audit[8214]: CRED_ACQ pid=8214 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred 
grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:02.637540 kernel: audit: type=1106 audit(1707511202.177:427): pid=8211 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:02.177000 audit[8211]: USER_END pid=8211 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:02.177000 audit[8211]: CRED_DISP pid=8211 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:02.783370 sshd[8235]: pam_unix(sshd:session): session closed for user core Feb 9 20:40:02.784766 systemd[1]: Started sshd@26-86.109.11.101:22-139.178.89.65:45642.service. Feb 9 20:40:02.785452 systemd[1]: sshd@25-86.109.11.101:22-139.178.89.65:45634.service: Deactivated successfully. Feb 9 20:40:02.786005 systemd-logind[1548]: Session 28 logged out. Waiting for processes to exit. Feb 9 20:40:02.786036 systemd[1]: session-28.scope: Deactivated successfully. Feb 9 20:40:02.786572 systemd-logind[1548]: Removed session 28. 
Feb 9 20:40:02.822067 kernel: audit: type=1104 audit(1707511202.177:428): pid=8211 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:02.178000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@25-86.109.11.101:22-139.178.89.65:45634 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:40:02.178000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@24-86.109.11.101:22-139.178.89.65:45632 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:40:02.303000 audit[8235]: USER_ACCT pid=8235 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:02.304000 audit[8235]: CRED_ACQ pid=8235 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:02.304000 audit[8235]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc160ec140 a2=3 a3=0 items=0 ppid=1 pid=8235 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=28 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:40:02.304000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 9 20:40:02.311000 audit[8235]: USER_START pid=8235 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open 
grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:02.311000 audit[8239]: CRED_ACQ pid=8239 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:02.784000 audit[8235]: USER_END pid=8235 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:02.784000 audit[8235]: CRED_DISP pid=8235 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:02.784000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@26-86.109.11.101:22-139.178.89.65:45642 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:40:02.785000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@25-86.109.11.101:22-139.178.89.65:45634 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 9 20:40:02.860000 audit[8257]: USER_ACCT pid=8257 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:02.860522 sshd[8257]: Accepted publickey for core from 139.178.89.65 port 45642 ssh2: RSA SHA256:ya3CuIx5HRXQ7ikfrirbGy0PeU2mVoIERJKJ2pM2LHs Feb 9 20:40:02.860000 audit[8257]: CRED_ACQ pid=8257 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:02.860000 audit[8257]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffe671f600 a2=3 a3=0 items=0 ppid=1 pid=8257 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=29 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:40:02.860000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 9 20:40:02.861213 sshd[8257]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 9 20:40:02.863788 systemd-logind[1548]: New session 29 of user core. Feb 9 20:40:02.864256 systemd[1]: Started session-29.scope. 
Feb 9 20:40:02.866000 audit[8257]: USER_START pid=8257 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Feb 9 20:40:02.866000 audit[8262]: CRED_ACQ pid=8262 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Feb 9 20:40:03.813458 sshd[8257]: pam_unix(sshd:session): session closed for user core
Feb 9 20:40:03.813000 audit[8257]: USER_END pid=8257 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Feb 9 20:40:03.813000 audit[8257]: CRED_DISP pid=8257 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Feb 9 20:40:03.815429 systemd[1]: Started sshd@27-86.109.11.101:22-139.178.89.65:45648.service.
Feb 9 20:40:03.815000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@27-86.109.11.101:22-139.178.89.65:45648 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:40:03.815816 systemd[1]: sshd@26-86.109.11.101:22-139.178.89.65:45642.service: Deactivated successfully.
Feb 9 20:40:03.815000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@26-86.109.11.101:22-139.178.89.65:45642 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:40:03.816624 systemd-logind[1548]: Session 29 logged out. Waiting for processes to exit. Feb 9 20:40:03.816688 systemd[1]: session-29.scope: Deactivated successfully. Feb 9 20:40:03.817272 systemd-logind[1548]: Removed session 29. Feb 9 20:40:03.824000 audit[8316]: NETFILTER_CFG table=filter:115 family=2 entries=24 op=nft_register_rule pid=8316 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 9 20:40:03.824000 audit[8316]: SYSCALL arch=c000003e syscall=46 success=yes exit=12476 a0=3 a1=7fff011d62e0 a2=0 a3=7fff011d62cc items=0 ppid=3048 pid=8316 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:40:03.824000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 9 20:40:03.824000 audit[8316]: NETFILTER_CFG table=nat:116 family=2 entries=30 op=nft_register_rule pid=8316 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 9 20:40:03.824000 audit[8316]: SYSCALL arch=c000003e syscall=46 success=yes exit=8836 a0=3 a1=7fff011d62e0 a2=0 a3=31030 items=0 ppid=3048 pid=8316 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:40:03.824000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 9 20:40:03.904000 audit[8306]: USER_ACCT pid=8306 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 
msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:03.904942 sshd[8306]: Accepted publickey for core from 139.178.89.65 port 45648 ssh2: RSA SHA256:ya3CuIx5HRXQ7ikfrirbGy0PeU2mVoIERJKJ2pM2LHs Feb 9 20:40:03.904000 audit[8343]: NETFILTER_CFG table=filter:117 family=2 entries=36 op=nft_register_rule pid=8343 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 9 20:40:03.904000 audit[8343]: SYSCALL arch=c000003e syscall=46 success=yes exit=12476 a0=3 a1=7fff52fd8060 a2=0 a3=7fff52fd804c items=0 ppid=3048 pid=8343 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:40:03.904000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 9 20:40:03.905000 audit[8306]: CRED_ACQ pid=8306 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:03.905000 audit[8306]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffe1da0890 a2=3 a3=0 items=0 ppid=1 pid=8306 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=30 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:40:03.905000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 9 20:40:03.906511 sshd[8306]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 9 20:40:03.912024 systemd-logind[1548]: New session 30 of user core. Feb 9 20:40:03.913151 systemd[1]: Started session-30.scope. 
Feb 9 20:40:03.905000 audit[8343]: NETFILTER_CFG table=nat:118 family=2 entries=30 op=nft_register_rule pid=8343 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 9 20:40:03.905000 audit[8343]: SYSCALL arch=c000003e syscall=46 success=yes exit=8836 a0=3 a1=7fff52fd8060 a2=0 a3=31030 items=0 ppid=3048 pid=8343 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:40:03.905000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 9 20:40:03.919000 audit[8306]: USER_START pid=8306 uid=0 auid=500 ses=30 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:03.921000 audit[8345]: CRED_ACQ pid=8345 uid=0 auid=500 ses=30 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:04.133660 sshd[8306]: pam_unix(sshd:session): session closed for user core Feb 9 20:40:04.136000 audit[8306]: USER_END pid=8306 uid=0 auid=500 ses=30 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:04.136000 audit[8306]: CRED_DISP pid=8306 uid=0 auid=500 ses=30 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 
20:40:04.140979 systemd[1]: Started sshd@28-86.109.11.101:22-139.178.89.65:45662.service. Feb 9 20:40:04.141000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@28-86.109.11.101:22-139.178.89.65:45662 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:40:04.143042 systemd[1]: sshd@27-86.109.11.101:22-139.178.89.65:45648.service: Deactivated successfully. Feb 9 20:40:04.143000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@27-86.109.11.101:22-139.178.89.65:45648 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:40:04.146156 systemd-logind[1548]: Session 30 logged out. Waiting for processes to exit. Feb 9 20:40:04.146315 systemd[1]: session-30.scope: Deactivated successfully. Feb 9 20:40:04.149080 systemd-logind[1548]: Removed session 30. Feb 9 20:40:04.208000 audit[8366]: USER_ACCT pid=8366 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:04.209000 audit[8366]: CRED_ACQ pid=8366 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:04.209000 audit[8366]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe942cbf60 a2=3 a3=0 items=0 ppid=1 pid=8366 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=31 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:40:04.209000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 9 20:40:04.210562 sshd[8366]: 
pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 9 20:40:04.223477 sshd[8366]: Accepted publickey for core from 139.178.89.65 port 45662 ssh2: RSA SHA256:ya3CuIx5HRXQ7ikfrirbGy0PeU2mVoIERJKJ2pM2LHs Feb 9 20:40:04.225895 systemd-logind[1548]: New session 31 of user core. Feb 9 20:40:04.226428 systemd[1]: Started session-31.scope. Feb 9 20:40:04.228000 audit[8366]: USER_START pid=8366 uid=0 auid=500 ses=31 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:04.229000 audit[8370]: CRED_ACQ pid=8370 uid=0 auid=500 ses=31 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:04.391087 sshd[8366]: pam_unix(sshd:session): session closed for user core Feb 9 20:40:04.392000 audit[8366]: USER_END pid=8366 uid=0 auid=500 ses=31 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:04.392000 audit[8366]: CRED_DISP pid=8366 uid=0 auid=500 ses=31 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:04.394033 systemd[1]: sshd@28-86.109.11.101:22-139.178.89.65:45662.service: Deactivated successfully. 
Feb 9 20:40:04.394000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@28-86.109.11.101:22-139.178.89.65:45662 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:40:04.395323 systemd-logind[1548]: Session 31 logged out. Waiting for processes to exit. Feb 9 20:40:04.395375 systemd[1]: session-31.scope: Deactivated successfully. Feb 9 20:40:04.396619 systemd-logind[1548]: Removed session 31. Feb 9 20:40:05.885189 env[1563]: time="2024-02-09T20:40:05.885061673Z" level=info msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\"" Feb 9 20:40:05.914729 env[1563]: time="2024-02-09T20:40:05.914694427Z" level=error msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\" failed" error="failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:40:05.914911 kubelet[2767]: E0209 20:40:05.914874 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e" Feb 9 20:40:05.914911 kubelet[2767]: E0209 20:40:05.914900 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e} Feb 9 20:40:05.915113 kubelet[2767]: E0209 20:40:05.914923 2767 
kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:40:05.915113 kubelet[2767]: E0209 20:40:05.914942 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-qgbqd" podUID=65f64fea-8fa3-417d-9fee-7bbdf36de2c6 Feb 9 20:40:09.397642 systemd[1]: Started sshd@29-86.109.11.101:22-139.178.89.65:44192.service. Feb 9 20:40:09.397000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@29-86.109.11.101:22-139.178.89.65:44192 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:40:09.425328 kernel: kauditd_printk_skb: 57 callbacks suppressed Feb 9 20:40:09.425369 kernel: audit: type=1130 audit(1707511209.397:470): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@29-86.109.11.101:22-139.178.89.65:44192 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 9 20:40:09.545000 audit[8424]: USER_ACCT pid=8424 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:09.545792 sshd[8424]: Accepted publickey for core from 139.178.89.65 port 44192 ssh2: RSA SHA256:ya3CuIx5HRXQ7ikfrirbGy0PeU2mVoIERJKJ2pM2LHs Feb 9 20:40:09.546614 sshd[8424]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 9 20:40:09.549127 systemd-logind[1548]: New session 32 of user core. Feb 9 20:40:09.549578 systemd[1]: Started session-32.scope. Feb 9 20:40:09.634004 sshd[8424]: pam_unix(sshd:session): session closed for user core Feb 9 20:40:09.635463 systemd[1]: sshd@29-86.109.11.101:22-139.178.89.65:44192.service: Deactivated successfully. Feb 9 20:40:09.636089 systemd[1]: session-32.scope: Deactivated successfully. Feb 9 20:40:09.636116 systemd-logind[1548]: Session 32 logged out. Waiting for processes to exit. Feb 9 20:40:09.636660 systemd-logind[1548]: Removed session 32. 
Feb 9 20:40:09.546000 audit[8424]: CRED_ACQ pid=8424 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:09.727146 kernel: audit: type=1101 audit(1707511209.545:471): pid=8424 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:09.727178 kernel: audit: type=1103 audit(1707511209.546:472): pid=8424 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:09.727199 kernel: audit: type=1006 audit(1707511209.546:473): pid=8424 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=32 res=1 Feb 9 20:40:09.785626 kernel: audit: type=1300 audit(1707511209.546:473): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc44f6e7e0 a2=3 a3=0 items=0 ppid=1 pid=8424 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=32 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:40:09.546000 audit[8424]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc44f6e7e0 a2=3 a3=0 items=0 ppid=1 pid=8424 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=32 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:40:09.877540 kernel: audit: type=1327 audit(1707511209.546:473): proctitle=737368643A20636F7265205B707269765D Feb 9 20:40:09.546000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 9 20:40:09.907939 kernel: 
audit: type=1105 audit(1707511209.551:474): pid=8424 uid=0 auid=500 ses=32 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:09.551000 audit[8424]: USER_START pid=8424 uid=0 auid=500 ses=32 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:10.002096 kernel: audit: type=1103 audit(1707511209.552:475): pid=8427 uid=0 auid=500 ses=32 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:09.552000 audit[8427]: CRED_ACQ pid=8427 uid=0 auid=500 ses=32 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:10.090978 kernel: audit: type=1106 audit(1707511209.634:476): pid=8424 uid=0 auid=500 ses=32 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:09.634000 audit[8424]: USER_END pid=8424 uid=0 auid=500 ses=32 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:10.186188 kernel: audit: 
type=1104 audit(1707511209.634:477): pid=8424 uid=0 auid=500 ses=32 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:09.634000 audit[8424]: CRED_DISP pid=8424 uid=0 auid=500 ses=32 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:09.635000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@29-86.109.11.101:22-139.178.89.65:44192 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:40:10.885066 env[1563]: time="2024-02-09T20:40:10.884968760Z" level=info msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\"" Feb 9 20:40:10.886005 env[1563]: time="2024-02-09T20:40:10.885175592Z" level=info msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\"" Feb 9 20:40:10.912722 env[1563]: time="2024-02-09T20:40:10.912637995Z" level=error msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\" failed" error="failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:40:10.912722 env[1563]: time="2024-02-09T20:40:10.912631197Z" level=error msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\" failed" error="failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:40:10.912870 kubelet[2767]: E0209 20:40:10.912843 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4" Feb 9 20:40:10.913030 kubelet[2767]: E0209 20:40:10.912875 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4} Feb 9 20:40:10.913030 kubelet[2767]: E0209 20:40:10.912897 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:40:10.913030 kubelet[2767]: E0209 20:40:10.912914 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-djg8j" podUID=c5bd3826-3b40-4247-a328-b65f90095c86 Feb 9 20:40:10.913030 kubelet[2767]: E0209 20:40:10.912842 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1" Feb 9 20:40:10.913030 kubelet[2767]: E0209 20:40:10.912929 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1} Feb 9 20:40:10.913157 kubelet[2767]: E0209 20:40:10.912943 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:40:10.913157 kubelet[2767]: E0209 20:40:10.912956 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="calico-system/csi-node-driver-dx2ql" podUID=21aae8c4-8c7c-48d6-86a1-b78761bdb569 Feb 9 20:40:12.925000 audit[8530]: NETFILTER_CFG table=filter:119 family=2 entries=24 op=nft_register_rule pid=8530 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 9 20:40:12.925000 audit[8530]: SYSCALL arch=c000003e syscall=46 success=yes exit=4028 a0=3 a1=7fffaec33da0 a2=0 a3=7fffaec33d8c items=0 ppid=3048 pid=8530 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:40:12.925000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 9 20:40:12.927000 audit[8530]: NETFILTER_CFG table=nat:120 family=2 entries=114 op=nft_register_chain pid=8530 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 9 20:40:12.927000 audit[8530]: SYSCALL arch=c000003e syscall=46 success=yes exit=50788 a0=3 a1=7fffaec33da0 a2=0 a3=7fffaec33d8c items=0 ppid=3048 pid=8530 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:40:12.927000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 9 20:40:14.640643 systemd[1]: Started sshd@30-86.109.11.101:22-139.178.89.65:44204.service. Feb 9 20:40:14.639000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@30-86.109.11.101:22-139.178.89.65:44204 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 9 20:40:14.667567 kernel: kauditd_printk_skb: 7 callbacks suppressed Feb 9 20:40:14.667625 kernel: audit: type=1130 audit(1707511214.639:481): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@30-86.109.11.101:22-139.178.89.65:44204 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:40:14.785000 audit[8532]: USER_ACCT pid=8532 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:14.787321 sshd[8532]: Accepted publickey for core from 139.178.89.65 port 44204 ssh2: RSA SHA256:ya3CuIx5HRXQ7ikfrirbGy0PeU2mVoIERJKJ2pM2LHs Feb 9 20:40:14.788660 sshd[8532]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 9 20:40:14.791242 systemd-logind[1548]: New session 33 of user core. Feb 9 20:40:14.791726 systemd[1]: Started session-33.scope. Feb 9 20:40:14.878066 sshd[8532]: pam_unix(sshd:session): session closed for user core Feb 9 20:40:14.787000 audit[8532]: CRED_ACQ pid=8532 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:14.879408 systemd[1]: sshd@30-86.109.11.101:22-139.178.89.65:44204.service: Deactivated successfully. Feb 9 20:40:14.880043 systemd[1]: session-33.scope: Deactivated successfully. Feb 9 20:40:14.880078 systemd-logind[1548]: Session 33 logged out. Waiting for processes to exit. Feb 9 20:40:14.880608 systemd-logind[1548]: Removed session 33. 
Feb 9 20:40:14.969090 kernel: audit: type=1101 audit(1707511214.785:482): pid=8532 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:14.969131 kernel: audit: type=1103 audit(1707511214.787:483): pid=8532 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:14.969147 kernel: audit: type=1006 audit(1707511214.787:484): pid=8532 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=33 res=1 Feb 9 20:40:15.027603 kernel: audit: type=1300 audit(1707511214.787:484): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffeb15d1d60 a2=3 a3=0 items=0 ppid=1 pid=8532 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=33 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:40:14.787000 audit[8532]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffeb15d1d60 a2=3 a3=0 items=0 ppid=1 pid=8532 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=33 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:40:15.119480 kernel: audit: type=1327 audit(1707511214.787:484): proctitle=737368643A20636F7265205B707269765D Feb 9 20:40:14.787000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 9 20:40:15.149884 kernel: audit: type=1105 audit(1707511214.792:485): pid=8532 uid=0 auid=500 ses=33 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" 
hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:14.792000 audit[8532]: USER_START pid=8532 uid=0 auid=500 ses=33 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:15.244037 kernel: audit: type=1103 audit(1707511214.793:486): pid=8535 uid=0 auid=500 ses=33 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:14.793000 audit[8535]: CRED_ACQ pid=8535 uid=0 auid=500 ses=33 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:15.332967 kernel: audit: type=1106 audit(1707511214.877:487): pid=8532 uid=0 auid=500 ses=33 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:14.877000 audit[8532]: USER_END pid=8532 uid=0 auid=500 ses=33 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:15.428190 kernel: audit: type=1104 audit(1707511214.877:488): pid=8532 uid=0 auid=500 ses=33 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 
20:40:14.877000 audit[8532]: CRED_DISP pid=8532 uid=0 auid=500 ses=33 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:14.878000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@30-86.109.11.101:22-139.178.89.65:44204 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:40:16.885832 env[1563]: time="2024-02-09T20:40:16.885691183Z" level=info msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\"" Feb 9 20:40:16.886757 env[1563]: time="2024-02-09T20:40:16.886314625Z" level=info msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\"" Feb 9 20:40:16.903821 env[1563]: time="2024-02-09T20:40:16.903786150Z" level=error msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\" failed" error="failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:40:16.904000 kubelet[2767]: E0209 20:40:16.903984 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e" Feb 9 20:40:16.904182 kubelet[2767]: E0209 20:40:16.904021 2767 kuberuntime_manager.go:965] "Failed 
to stop sandbox" podSandboxID={Type:containerd ID:d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e} Feb 9 20:40:16.904182 kubelet[2767]: E0209 20:40:16.904055 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:40:16.904182 kubelet[2767]: E0209 20:40:16.904084 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-qgbqd" podUID=65f64fea-8fa3-417d-9fee-7bbdf36de2c6 Feb 9 20:40:16.904314 env[1563]: time="2024-02-09T20:40:16.904283840Z" level=error msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\" failed" error="failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:40:16.904428 kubelet[2767]: E0209 20:40:16.904389 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for 
sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5" Feb 9 20:40:16.904428 kubelet[2767]: E0209 20:40:16.904405 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5} Feb 9 20:40:16.904428 kubelet[2767]: E0209 20:40:16.904427 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:40:16.904530 kubelet[2767]: E0209 20:40:16.904444 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-68576cfd85-6rqtz" podUID=0d4f3e13-ea2f-4678-87aa-ee971f79a1cb Feb 9 20:40:19.884829 systemd[1]: Started sshd@31-86.109.11.101:22-139.178.89.65:37486.service. 
Feb 9 20:40:19.883000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@31-86.109.11.101:22-139.178.89.65:37486 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:40:19.911875 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 9 20:40:19.911922 kernel: audit: type=1130 audit(1707511219.883:490): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@31-86.109.11.101:22-139.178.89.65:37486 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:40:20.030000 audit[8619]: USER_ACCT pid=8619 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:20.032023 sshd[8619]: Accepted publickey for core from 139.178.89.65 port 37486 ssh2: RSA SHA256:ya3CuIx5HRXQ7ikfrirbGy0PeU2mVoIERJKJ2pM2LHs Feb 9 20:40:20.032880 sshd[8619]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 9 20:40:20.034983 systemd-logind[1548]: New session 34 of user core. Feb 9 20:40:20.035699 systemd[1]: Started session-34.scope. Feb 9 20:40:20.031000 audit[8619]: CRED_ACQ pid=8619 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:20.125166 sshd[8619]: pam_unix(sshd:session): session closed for user core Feb 9 20:40:20.126448 systemd[1]: sshd@31-86.109.11.101:22-139.178.89.65:37486.service: Deactivated successfully. Feb 9 20:40:20.127052 systemd-logind[1548]: Session 34 logged out. Waiting for processes to exit. Feb 9 20:40:20.127084 systemd[1]: session-34.scope: Deactivated successfully. 
Feb 9 20:40:20.127510 systemd-logind[1548]: Removed session 34. Feb 9 20:40:20.213415 kernel: audit: type=1101 audit(1707511220.030:491): pid=8619 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:20.213461 kernel: audit: type=1103 audit(1707511220.031:492): pid=8619 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:20.213474 kernel: audit: type=1006 audit(1707511220.031:493): pid=8619 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=34 res=1 Feb 9 20:40:20.271914 kernel: audit: type=1300 audit(1707511220.031:493): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc81a0a180 a2=3 a3=0 items=0 ppid=1 pid=8619 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=34 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:40:20.031000 audit[8619]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc81a0a180 a2=3 a3=0 items=0 ppid=1 pid=8619 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=34 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:40:20.363888 kernel: audit: type=1327 audit(1707511220.031:493): proctitle=737368643A20636F7265205B707269765D Feb 9 20:40:20.031000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 9 20:40:20.394332 kernel: audit: type=1105 audit(1707511220.036:494): pid=8619 uid=0 auid=500 ses=34 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open 
grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:20.036000 audit[8619]: USER_START pid=8619 uid=0 auid=500 ses=34 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:20.488487 kernel: audit: type=1103 audit(1707511220.037:495): pid=8622 uid=0 auid=500 ses=34 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:20.037000 audit[8622]: CRED_ACQ pid=8622 uid=0 auid=500 ses=34 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:20.577417 kernel: audit: type=1106 audit(1707511220.124:496): pid=8619 uid=0 auid=500 ses=34 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:20.124000 audit[8619]: USER_END pid=8619 uid=0 auid=500 ses=34 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:20.672643 kernel: audit: type=1104 audit(1707511220.124:497): pid=8619 uid=0 auid=500 ses=34 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred 
grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:20.124000 audit[8619]: CRED_DISP pid=8619 uid=0 auid=500 ses=34 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:20.125000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@31-86.109.11.101:22-139.178.89.65:37486 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:40:25.130304 systemd[1]: Started sshd@32-86.109.11.101:22-139.178.89.65:37488.service. Feb 9 20:40:25.129000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@32-86.109.11.101:22-139.178.89.65:37488 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:40:25.166381 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 9 20:40:25.166431 kernel: audit: type=1130 audit(1707511225.129:499): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@32-86.109.11.101:22-139.178.89.65:37488 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 9 20:40:25.309000 audit[8645]: USER_ACCT pid=8645 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:25.311725 sshd[8645]: Accepted publickey for core from 139.178.89.65 port 37488 ssh2: RSA SHA256:ya3CuIx5HRXQ7ikfrirbGy0PeU2mVoIERJKJ2pM2LHs Feb 9 20:40:25.315679 sshd[8645]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 9 20:40:25.325579 systemd-logind[1548]: New session 35 of user core. Feb 9 20:40:25.327775 systemd[1]: Started session-35.scope. Feb 9 20:40:25.313000 audit[8645]: CRED_ACQ pid=8645 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:25.495199 kernel: audit: type=1101 audit(1707511225.309:500): pid=8645 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:25.495278 kernel: audit: type=1103 audit(1707511225.313:501): pid=8645 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:25.495294 kernel: audit: type=1006 audit(1707511225.313:502): pid=8645 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=35 res=1 Feb 9 20:40:25.497295 sshd[8645]: pam_unix(sshd:session): session closed for user core Feb 9 20:40:25.498797 systemd[1]: 
sshd@32-86.109.11.101:22-139.178.89.65:37488.service: Deactivated successfully. Feb 9 20:40:25.499366 systemd-logind[1548]: Session 35 logged out. Waiting for processes to exit. Feb 9 20:40:25.499378 systemd[1]: session-35.scope: Deactivated successfully. Feb 9 20:40:25.499916 systemd-logind[1548]: Removed session 35. Feb 9 20:40:25.313000 audit[8645]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe44bc62f0 a2=3 a3=0 items=0 ppid=1 pid=8645 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=35 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:40:25.645697 kernel: audit: type=1300 audit(1707511225.313:502): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe44bc62f0 a2=3 a3=0 items=0 ppid=1 pid=8645 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=35 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:40:25.645809 kernel: audit: type=1327 audit(1707511225.313:502): proctitle=737368643A20636F7265205B707269765D Feb 9 20:40:25.313000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 9 20:40:25.337000 audit[8645]: USER_START pid=8645 uid=0 auid=500 ses=35 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:25.770422 kernel: audit: type=1105 audit(1707511225.337:503): pid=8645 uid=0 auid=500 ses=35 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:25.770480 kernel: audit: type=1103 audit(1707511225.339:504): pid=8648 uid=0 auid=500 ses=35 
subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:25.339000 audit[8648]: CRED_ACQ pid=8648 uid=0 auid=500 ses=35 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:25.859417 kernel: audit: type=1106 audit(1707511225.496:505): pid=8645 uid=0 auid=500 ses=35 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:25.496000 audit[8645]: USER_END pid=8645 uid=0 auid=500 ses=35 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:25.884168 env[1563]: time="2024-02-09T20:40:25.884132562Z" level=info msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\"" Feb 9 20:40:25.884168 env[1563]: time="2024-02-09T20:40:25.884132671Z" level=info msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\"" Feb 9 20:40:25.896309 env[1563]: time="2024-02-09T20:40:25.896273470Z" level=error msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\" failed" error="failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" Feb 9 20:40:25.896472 kubelet[2767]: E0209 20:40:25.896427 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4" Feb 9 20:40:25.896472 kubelet[2767]: E0209 20:40:25.896453 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4} Feb 9 20:40:25.896472 kubelet[2767]: E0209 20:40:25.896475 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:40:25.896723 kubelet[2767]: E0209 20:40:25.896494 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-djg8j" podUID=c5bd3826-3b40-4247-a328-b65f90095c86 Feb 9 20:40:25.896761 
env[1563]: time="2024-02-09T20:40:25.896714105Z" level=error msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\" failed" error="failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:40:25.896790 kubelet[2767]: E0209 20:40:25.896782 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1" Feb 9 20:40:25.896815 kubelet[2767]: E0209 20:40:25.896797 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1} Feb 9 20:40:25.896835 kubelet[2767]: E0209 20:40:25.896817 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:40:25.896835 kubelet[2767]: E0209 20:40:25.896831 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc 
error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-dx2ql" podUID=21aae8c4-8c7c-48d6-86a1-b78761bdb569 Feb 9 20:40:25.954740 kernel: audit: type=1104 audit(1707511225.496:506): pid=8645 uid=0 auid=500 ses=35 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:25.496000 audit[8645]: CRED_DISP pid=8645 uid=0 auid=500 ses=35 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:25.497000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@32-86.109.11.101:22-139.178.89.65:37488 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 9 20:40:28.885101 env[1563]: time="2024-02-09T20:40:28.885013302Z" level=info msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\"" Feb 9 20:40:28.911718 env[1563]: time="2024-02-09T20:40:28.911642967Z" level=error msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\" failed" error="failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:40:28.911852 kubelet[2767]: E0209 20:40:28.911833 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e" Feb 9 20:40:28.912026 kubelet[2767]: E0209 20:40:28.911859 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e} Feb 9 20:40:28.912026 kubelet[2767]: E0209 20:40:28.911893 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 
20:40:28.912026 kubelet[2767]: E0209 20:40:28.911910 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-qgbqd" podUID=65f64fea-8fa3-417d-9fee-7bbdf36de2c6 Feb 9 20:40:29.884805 env[1563]: time="2024-02-09T20:40:29.884642858Z" level=info msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\"" Feb 9 20:40:29.911284 env[1563]: time="2024-02-09T20:40:29.911222323Z" level=error msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\" failed" error="failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:40:29.911613 kubelet[2767]: E0209 20:40:29.911455 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5" Feb 9 20:40:29.911613 kubelet[2767]: E0209 20:40:29.911486 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd 
ID:4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5} Feb 9 20:40:29.911613 kubelet[2767]: E0209 20:40:29.911517 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:40:29.911613 kubelet[2767]: E0209 20:40:29.911542 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-68576cfd85-6rqtz" podUID=0d4f3e13-ea2f-4678-87aa-ee971f79a1cb Feb 9 20:40:30.504915 systemd[1]: Started sshd@33-86.109.11.101:22-139.178.89.65:53378.service. Feb 9 20:40:30.504000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@33-86.109.11.101:22-139.178.89.65:53378 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:40:30.532236 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 9 20:40:30.532269 kernel: audit: type=1130 audit(1707511230.504:508): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@33-86.109.11.101:22-139.178.89.65:53378 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? 
terminal=? res=success' Feb 9 20:40:30.652000 audit[8785]: USER_ACCT pid=8785 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:30.652816 sshd[8785]: Accepted publickey for core from 139.178.89.65 port 53378 ssh2: RSA SHA256:ya3CuIx5HRXQ7ikfrirbGy0PeU2mVoIERJKJ2pM2LHs Feb 9 20:40:30.653618 sshd[8785]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 9 20:40:30.655744 systemd-logind[1548]: New session 36 of user core. Feb 9 20:40:30.656110 systemd[1]: Started session-36.scope. Feb 9 20:40:30.653000 audit[8785]: CRED_ACQ pid=8785 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:30.746118 sshd[8785]: pam_unix(sshd:session): session closed for user core Feb 9 20:40:30.747601 systemd[1]: sshd@33-86.109.11.101:22-139.178.89.65:53378.service: Deactivated successfully. Feb 9 20:40:30.748234 systemd-logind[1548]: Session 36 logged out. Waiting for processes to exit. Feb 9 20:40:30.748292 systemd[1]: session-36.scope: Deactivated successfully. Feb 9 20:40:30.748828 systemd-logind[1548]: Removed session 36. 
Feb 9 20:40:30.834321 kernel: audit: type=1101 audit(1707511230.652:509): pid=8785 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:30.834363 kernel: audit: type=1103 audit(1707511230.653:510): pid=8785 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:30.834382 kernel: audit: type=1006 audit(1707511230.653:511): pid=8785 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=36 res=1 Feb 9 20:40:30.892835 kernel: audit: type=1300 audit(1707511230.653:511): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffcf554cf10 a2=3 a3=0 items=0 ppid=1 pid=8785 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=36 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:40:30.653000 audit[8785]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffcf554cf10 a2=3 a3=0 items=0 ppid=1 pid=8785 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=36 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:40:30.984935 kernel: audit: type=1327 audit(1707511230.653:511): proctitle=737368643A20636F7265205B707269765D Feb 9 20:40:30.653000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 9 20:40:31.015412 kernel: audit: type=1105 audit(1707511230.657:512): pid=8785 uid=0 auid=500 ses=36 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" 
hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:30.657000 audit[8785]: USER_START pid=8785 uid=0 auid=500 ses=36 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:30.658000 audit[8788]: CRED_ACQ pid=8788 uid=0 auid=500 ses=36 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:31.198542 kernel: audit: type=1103 audit(1707511230.658:513): pid=8788 uid=0 auid=500 ses=36 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:31.198578 kernel: audit: type=1106 audit(1707511230.746:514): pid=8785 uid=0 auid=500 ses=36 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:30.746000 audit[8785]: USER_END pid=8785 uid=0 auid=500 ses=36 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:31.293798 kernel: audit: type=1104 audit(1707511230.746:515): pid=8785 uid=0 auid=500 ses=36 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 
20:40:30.746000 audit[8785]: CRED_DISP pid=8785 uid=0 auid=500 ses=36 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:30.747000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@33-86.109.11.101:22-139.178.89.65:53378 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:40:35.752754 systemd[1]: Started sshd@34-86.109.11.101:22-139.178.89.65:53386.service. Feb 9 20:40:35.751000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@34-86.109.11.101:22-139.178.89.65:53386 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:40:35.780043 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 9 20:40:35.780070 kernel: audit: type=1130 audit(1707511235.751:517): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@34-86.109.11.101:22-139.178.89.65:53386 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:40:35.900000 audit[8812]: USER_ACCT pid=8812 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:35.901440 sshd[8812]: Accepted publickey for core from 139.178.89.65 port 53386 ssh2: RSA SHA256:ya3CuIx5HRXQ7ikfrirbGy0PeU2mVoIERJKJ2pM2LHs Feb 9 20:40:35.902604 sshd[8812]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 9 20:40:35.904915 systemd-logind[1548]: New session 37 of user core. Feb 9 20:40:35.905605 systemd[1]: Started session-37.scope. 
Feb 9 20:40:35.901000 audit[8812]: CRED_ACQ pid=8812 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:36.085320 kernel: audit: type=1101 audit(1707511235.900:518): pid=8812 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:36.085360 kernel: audit: type=1103 audit(1707511235.901:519): pid=8812 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:36.085376 kernel: audit: type=1006 audit(1707511235.901:520): pid=8812 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=37 res=1 Feb 9 20:40:36.143807 kernel: audit: type=1300 audit(1707511235.901:520): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc1af961a0 a2=3 a3=0 items=0 ppid=1 pid=8812 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=37 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:40:35.901000 audit[8812]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc1af961a0 a2=3 a3=0 items=0 ppid=1 pid=8812 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=37 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:40:36.235768 kernel: audit: type=1327 audit(1707511235.901:520): proctitle=737368643A20636F7265205B707269765D Feb 9 20:40:35.901000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 9 20:40:36.266187 kernel: 
audit: type=1105 audit(1707511235.906:521): pid=8812 uid=0 auid=500 ses=37 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:35.906000 audit[8812]: USER_START pid=8812 uid=0 auid=500 ses=37 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:36.266381 sshd[8812]: pam_unix(sshd:session): session closed for user core Feb 9 20:40:36.267824 systemd[1]: sshd@34-86.109.11.101:22-139.178.89.65:53386.service: Deactivated successfully. Feb 9 20:40:36.268427 systemd-logind[1548]: Session 37 logged out. Waiting for processes to exit. Feb 9 20:40:36.268486 systemd[1]: session-37.scope: Deactivated successfully. Feb 9 20:40:36.268938 systemd-logind[1548]: Removed session 37. 
Feb 9 20:40:36.360323 kernel: audit: type=1103 audit(1707511235.907:522): pid=8815 uid=0 auid=500 ses=37 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:35.907000 audit[8815]: CRED_ACQ pid=8815 uid=0 auid=500 ses=37 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:36.449270 kernel: audit: type=1106 audit(1707511236.265:523): pid=8812 uid=0 auid=500 ses=37 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:36.265000 audit[8812]: USER_END pid=8812 uid=0 auid=500 ses=37 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:36.265000 audit[8812]: CRED_DISP pid=8812 uid=0 auid=500 ses=37 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:36.633472 kernel: audit: type=1104 audit(1707511236.265:524): pid=8812 uid=0 auid=500 ses=37 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:36.267000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 
msg='unit=sshd@34-86.109.11.101:22-139.178.89.65:53386 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:40:37.885622 env[1563]: time="2024-02-09T20:40:37.885499226Z" level=info msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\"" Feb 9 20:40:37.914856 env[1563]: time="2024-02-09T20:40:37.914801156Z" level=error msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\" failed" error="failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:40:37.915071 kubelet[2767]: E0209 20:40:37.915060 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4" Feb 9 20:40:37.915234 kubelet[2767]: E0209 20:40:37.915087 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4} Feb 9 20:40:37.915234 kubelet[2767]: E0209 20:40:37.915109 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:40:37.915234 kubelet[2767]: E0209 20:40:37.915126 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-djg8j" podUID=c5bd3826-3b40-4247-a328-b65f90095c86 Feb 9 20:40:38.885116 env[1563]: time="2024-02-09T20:40:38.884982676Z" level=info msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\"" Feb 9 20:40:38.914146 env[1563]: time="2024-02-09T20:40:38.914085431Z" level=error msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\" failed" error="failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:40:38.914447 kubelet[2767]: E0209 20:40:38.914269 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1" Feb 9 20:40:38.914447 
kubelet[2767]: E0209 20:40:38.914299 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1} Feb 9 20:40:38.914447 kubelet[2767]: E0209 20:40:38.914320 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:40:38.914447 kubelet[2767]: E0209 20:40:38.914345 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-dx2ql" podUID=21aae8c4-8c7c-48d6-86a1-b78761bdb569 Feb 9 20:40:40.885280 env[1563]: time="2024-02-09T20:40:40.885169993Z" level=info msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\"" Feb 9 20:40:40.912589 env[1563]: time="2024-02-09T20:40:40.912508112Z" level=error msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\" failed" error="failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/" Feb 9 20:40:40.912773 kubelet[2767]: E0209 20:40:40.912761 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e" Feb 9 20:40:40.913015 kubelet[2767]: E0209 20:40:40.912795 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e} Feb 9 20:40:40.913015 kubelet[2767]: E0209 20:40:40.912832 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:40:40.913015 kubelet[2767]: E0209 20:40:40.912853 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-qgbqd" 
podUID=65f64fea-8fa3-417d-9fee-7bbdf36de2c6 Feb 9 20:40:41.247486 systemd[1]: Started sshd@35-86.109.11.101:22-139.178.89.65:36900.service. Feb 9 20:40:41.246000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@35-86.109.11.101:22-139.178.89.65:36900 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:40:41.275173 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 9 20:40:41.275257 kernel: audit: type=1130 audit(1707511241.246:526): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@35-86.109.11.101:22-139.178.89.65:36900 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:40:41.381760 systemd[1]: Started sshd@36-86.109.11.101:22-218.92.0.59:32764.service. Feb 9 20:40:41.380000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@36-86.109.11.101:22-218.92.0.59:32764 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:40:41.394381 sshd[8923]: Accepted publickey for core from 139.178.89.65 port 36900 ssh2: RSA SHA256:ya3CuIx5HRXQ7ikfrirbGy0PeU2mVoIERJKJ2pM2LHs Feb 9 20:40:41.395615 sshd[8923]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 9 20:40:41.397609 systemd-logind[1548]: New session 38 of user core. Feb 9 20:40:41.398291 systemd[1]: Started session-38.scope. 
Feb 9 20:40:41.392000 audit[8923]: USER_ACCT pid=8923 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:41.475212 sshd[8923]: pam_unix(sshd:session): session closed for user core Feb 9 20:40:41.476732 systemd[1]: sshd@35-86.109.11.101:22-139.178.89.65:36900.service: Deactivated successfully. Feb 9 20:40:41.477347 systemd-logind[1548]: Session 38 logged out. Waiting for processes to exit. Feb 9 20:40:41.477409 systemd[1]: session-38.scope: Deactivated successfully. Feb 9 20:40:41.477864 systemd-logind[1548]: Removed session 38. Feb 9 20:40:41.561150 kernel: audit: type=1130 audit(1707511241.380:527): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@36-86.109.11.101:22-218.92.0.59:32764 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 9 20:40:41.561196 kernel: audit: type=1101 audit(1707511241.392:528): pid=8923 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:41.561212 kernel: audit: type=1103 audit(1707511241.394:529): pid=8923 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:41.394000 audit[8923]: CRED_ACQ pid=8923 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:41.561289 sshd[8925]: Unable to negotiate with 218.92.0.59 port 32764: no matching key exchange method found. Their offer: diffie-hellman-group1-sha1,diffie-hellman-group14-sha1,diffie-hellman-group-exchange-sha1 [preauth] Feb 9 20:40:41.561677 systemd[1]: sshd@36-86.109.11.101:22-218.92.0.59:32764.service: Deactivated successfully. 
Feb 9 20:40:41.651255 kernel: audit: type=1006 audit(1707511241.394:530): pid=8923 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=38 res=1 Feb 9 20:40:41.709643 kernel: audit: type=1300 audit(1707511241.394:530): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe7bd3ba40 a2=3 a3=0 items=0 ppid=1 pid=8923 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=38 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:40:41.394000 audit[8923]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe7bd3ba40 a2=3 a3=0 items=0 ppid=1 pid=8923 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=38 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:40:41.801414 kernel: audit: type=1327 audit(1707511241.394:530): proctitle=737368643A20636F7265205B707269765D Feb 9 20:40:41.394000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 9 20:40:41.831743 kernel: audit: type=1105 audit(1707511241.399:531): pid=8923 uid=0 auid=500 ses=38 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:41.399000 audit[8923]: USER_START pid=8923 uid=0 auid=500 ses=38 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:41.926654 kernel: audit: type=1103 audit(1707511241.399:532): pid=8928 uid=0 auid=500 ses=38 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 
addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:41.399000 audit[8928]: CRED_ACQ pid=8928 uid=0 auid=500 ses=38 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:41.474000 audit[8923]: USER_END pid=8923 uid=0 auid=500 ses=38 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:42.111417 kernel: audit: type=1106 audit(1707511241.474:533): pid=8923 uid=0 auid=500 ses=38 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:41.474000 audit[8923]: CRED_DISP pid=8923 uid=0 auid=500 ses=38 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:41.475000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@35-86.109.11.101:22-139.178.89.65:36900 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:40:41.560000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@36-86.109.11.101:22-218.92.0.59:32764 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 9 20:40:44.885502 env[1563]: time="2024-02-09T20:40:44.885413612Z" level=info msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\"" Feb 9 20:40:44.912169 env[1563]: time="2024-02-09T20:40:44.912109260Z" level=error msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\" failed" error="failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:40:44.912298 kubelet[2767]: E0209 20:40:44.912287 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5" Feb 9 20:40:44.912519 kubelet[2767]: E0209 20:40:44.912315 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5} Feb 9 20:40:44.912519 kubelet[2767]: E0209 20:40:44.912342 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 
20:40:44.912519 kubelet[2767]: E0209 20:40:44.912379 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-68576cfd85-6rqtz" podUID=0d4f3e13-ea2f-4678-87aa-ee971f79a1cb Feb 9 20:40:46.482159 systemd[1]: Started sshd@37-86.109.11.101:22-139.178.89.65:36912.service. Feb 9 20:40:46.481000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@37-86.109.11.101:22-139.178.89.65:36912 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:40:46.509423 kernel: kauditd_printk_skb: 3 callbacks suppressed Feb 9 20:40:46.509514 kernel: audit: type=1130 audit(1707511246.481:537): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@37-86.109.11.101:22-139.178.89.65:36912 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 9 20:40:46.629000 audit[8982]: USER_ACCT pid=8982 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:46.630199 sshd[8982]: Accepted publickey for core from 139.178.89.65 port 36912 ssh2: RSA SHA256:ya3CuIx5HRXQ7ikfrirbGy0PeU2mVoIERJKJ2pM2LHs Feb 9 20:40:46.631622 sshd[8982]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 9 20:40:46.633994 systemd-logind[1548]: New session 39 of user core. Feb 9 20:40:46.634473 systemd[1]: Started session-39.scope. Feb 9 20:40:46.631000 audit[8982]: CRED_ACQ pid=8982 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:40:46.727088 sshd[8982]: pam_unix(sshd:session): session closed for user core Feb 9 20:40:46.728454 systemd[1]: sshd@37-86.109.11.101:22-139.178.89.65:36912.service: Deactivated successfully. Feb 9 20:40:46.729062 systemd-logind[1548]: Session 39 logged out. Waiting for processes to exit. Feb 9 20:40:46.729070 systemd[1]: session-39.scope: Deactivated successfully. Feb 9 20:40:46.729515 systemd-logind[1548]: Removed session 39. 
Feb 9 20:40:46.815280 kernel: audit: type=1101 audit(1707511246.629:538): pid=8982 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Feb 9 20:40:46.815322 kernel: audit: type=1103 audit(1707511246.631:539): pid=8982 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Feb 9 20:40:46.815347 kernel: audit: type=1006 audit(1707511246.631:540): pid=8982 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=39 res=1
Feb 9 20:40:46.874028 kernel: audit: type=1300 audit(1707511246.631:540): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc25c87440 a2=3 a3=0 items=0 ppid=1 pid=8982 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=39 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 9 20:40:46.631000 audit[8982]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc25c87440 a2=3 a3=0 items=0 ppid=1 pid=8982 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=39 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 9 20:40:46.631000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D
Feb 9 20:40:46.996860 kernel: audit: type=1327 audit(1707511246.631:540): proctitle=737368643A20636F7265205B707269765D
Feb 9 20:40:46.996889 kernel: audit: type=1105 audit(1707511246.636:541): pid=8982 uid=0 auid=500 ses=39 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Feb 9 20:40:46.636000 audit[8982]: USER_START pid=8982 uid=0 auid=500 ses=39 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Feb 9 20:40:47.091377 kernel: audit: type=1103 audit(1707511246.636:542): pid=8985 uid=0 auid=500 ses=39 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Feb 9 20:40:46.636000 audit[8985]: CRED_ACQ pid=8985 uid=0 auid=500 ses=39 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Feb 9 20:40:47.180663 kernel: audit: type=1106 audit(1707511246.727:543): pid=8982 uid=0 auid=500 ses=39 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Feb 9 20:40:46.727000 audit[8982]: USER_END pid=8982 uid=0 auid=500 ses=39 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Feb 9 20:40:47.276333 kernel: audit: type=1104 audit(1707511246.727:544): pid=8982 uid=0 auid=500 ses=39 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Feb 9 20:40:46.727000 audit[8982]: CRED_DISP pid=8982 uid=0 auid=500 ses=39 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Feb 9 20:40:46.728000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@37-86.109.11.101:22-139.178.89.65:36912 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:40:48.885012 env[1563]: time="2024-02-09T20:40:48.884900904Z" level=info msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\""
Feb 9 20:40:48.914072 env[1563]: time="2024-02-09T20:40:48.914010175Z" level=error msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\" failed" error="failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 9 20:40:48.914265 kubelet[2767]: E0209 20:40:48.914254 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4"
Feb 9 20:40:48.914479 kubelet[2767]: E0209 20:40:48.914282 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4}
Feb 9 20:40:48.914479 kubelet[2767]: E0209 20:40:48.914303 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 9 20:40:48.914479 kubelet[2767]: E0209 20:40:48.914322 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-djg8j" podUID=c5bd3826-3b40-4247-a328-b65f90095c86
Feb 9 20:40:51.733215 systemd[1]: Started sshd@38-86.109.11.101:22-139.178.89.65:51320.service.
Feb 9 20:40:51.731000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@38-86.109.11.101:22-139.178.89.65:51320 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:40:51.760389 kernel: kauditd_printk_skb: 1 callbacks suppressed
Feb 9 20:40:51.760487 kernel: audit: type=1130 audit(1707511251.731:546): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@38-86.109.11.101:22-139.178.89.65:51320 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:40:51.879000 audit[9038]: USER_ACCT pid=9038 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Feb 9 20:40:51.880832 sshd[9038]: Accepted publickey for core from 139.178.89.65 port 51320 ssh2: RSA SHA256:ya3CuIx5HRXQ7ikfrirbGy0PeU2mVoIERJKJ2pM2LHs
Feb 9 20:40:51.881609 sshd[9038]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Feb 9 20:40:51.883922 systemd-logind[1548]: New session 40 of user core.
Feb 9 20:40:51.884446 systemd[1]: Started session-40.scope.
Feb 9 20:40:51.962037 sshd[9038]: pam_unix(sshd:session): session closed for user core
Feb 9 20:40:51.963402 systemd[1]: sshd@38-86.109.11.101:22-139.178.89.65:51320.service: Deactivated successfully.
Feb 9 20:40:51.964021 systemd[1]: session-40.scope: Deactivated successfully.
Feb 9 20:40:51.964052 systemd-logind[1548]: Session 40 logged out. Waiting for processes to exit.
Feb 9 20:40:51.964498 systemd-logind[1548]: Removed session 40.
Feb 9 20:40:51.880000 audit[9038]: CRED_ACQ pid=9038 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Feb 9 20:40:52.062809 kernel: audit: type=1101 audit(1707511251.879:547): pid=9038 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Feb 9 20:40:52.062854 kernel: audit: type=1103 audit(1707511251.880:548): pid=9038 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Feb 9 20:40:52.062870 kernel: audit: type=1006 audit(1707511251.880:549): pid=9038 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=40 res=1
Feb 9 20:40:52.121633 kernel: audit: type=1300 audit(1707511251.880:549): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffeab9fc1b0 a2=3 a3=0 items=0 ppid=1 pid=9038 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=40 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 9 20:40:51.880000 audit[9038]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffeab9fc1b0 a2=3 a3=0 items=0 ppid=1 pid=9038 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=40 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 9 20:40:52.213992 kernel: audit: type=1327 audit(1707511251.880:549): proctitle=737368643A20636F7265205B707269765D
Feb 9 20:40:51.880000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D
Feb 9 20:40:52.244581 kernel: audit: type=1105 audit(1707511251.885:550): pid=9038 uid=0 auid=500 ses=40 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Feb 9 20:40:51.885000 audit[9038]: USER_START pid=9038 uid=0 auid=500 ses=40 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Feb 9 20:40:52.339212 kernel: audit: type=1103 audit(1707511251.885:551): pid=9041 uid=0 auid=500 ses=40 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Feb 9 20:40:51.885000 audit[9041]: CRED_ACQ pid=9041 uid=0 auid=500 ses=40 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Feb 9 20:40:52.428166 kernel: audit: type=1106 audit(1707511251.961:552): pid=9038 uid=0 auid=500 ses=40 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Feb 9 20:40:51.961000 audit[9038]: USER_END pid=9038 uid=0 auid=500 ses=40 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Feb 9 20:40:52.523417 kernel: audit: type=1104 audit(1707511251.961:553): pid=9038 uid=0 auid=500 ses=40 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Feb 9 20:40:51.961000 audit[9038]: CRED_DISP pid=9038 uid=0 auid=500 ses=40 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Feb 9 20:40:51.962000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@38-86.109.11.101:22-139.178.89.65:51320 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:40:52.885826 env[1563]: time="2024-02-09T20:40:52.885722544Z" level=info msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\""
Feb 9 20:40:52.911327 env[1563]: time="2024-02-09T20:40:52.911269546Z" level=error msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\" failed" error="failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 9 20:40:52.911523 kubelet[2767]: E0209 20:40:52.911509 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1"
Feb 9 20:40:52.911696 kubelet[2767]: E0209 20:40:52.911536 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1}
Feb 9 20:40:52.911696 kubelet[2767]: E0209 20:40:52.911562 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 9 20:40:52.911696 kubelet[2767]: E0209 20:40:52.911580 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-dx2ql" podUID=21aae8c4-8c7c-48d6-86a1-b78761bdb569
Feb 9 20:40:54.884307 env[1563]: time="2024-02-09T20:40:54.884263722Z" level=info msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\""
Feb 9 20:40:54.905149 env[1563]: time="2024-02-09T20:40:54.905116661Z" level=error msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\" failed" error="failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 9 20:40:54.905314 kubelet[2767]: E0209 20:40:54.905300 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e"
Feb 9 20:40:54.905571 kubelet[2767]: E0209 20:40:54.905336 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e}
Feb 9 20:40:54.905571 kubelet[2767]: E0209 20:40:54.905424 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 9 20:40:54.905571 kubelet[2767]: E0209 20:40:54.905453 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-qgbqd" podUID=65f64fea-8fa3-417d-9fee-7bbdf36de2c6
Feb 9 20:40:56.884858 env[1563]: time="2024-02-09T20:40:56.884743868Z" level=info msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\""
Feb 9 20:40:56.937721 env[1563]: time="2024-02-09T20:40:56.937598362Z" level=error msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\" failed" error="failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 9 20:40:56.938000 kubelet[2767]: E0209 20:40:56.937954 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5"
Feb 9 20:40:56.938637 kubelet[2767]: E0209 20:40:56.938025 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5}
Feb 9 20:40:56.938637 kubelet[2767]: E0209 20:40:56.938099 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 9 20:40:56.938637 kubelet[2767]: E0209 20:40:56.938157 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-68576cfd85-6rqtz" podUID=0d4f3e13-ea2f-4678-87aa-ee971f79a1cb
Feb 9 20:40:56.969093 systemd[1]: Started sshd@39-86.109.11.101:22-139.178.89.65:51332.service.
Feb 9 20:40:56.968000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@39-86.109.11.101:22-139.178.89.65:51332 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:40:57.007238 kernel: kauditd_printk_skb: 1 callbacks suppressed
Feb 9 20:40:57.007327 kernel: audit: type=1130 audit(1707511256.968:555): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@39-86.109.11.101:22-139.178.89.65:51332 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:40:57.140000 audit[9156]: USER_ACCT pid=9156 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Feb 9 20:40:57.142238 sshd[9156]: Accepted publickey for core from 139.178.89.65 port 51332 ssh2: RSA SHA256:ya3CuIx5HRXQ7ikfrirbGy0PeU2mVoIERJKJ2pM2LHs
Feb 9 20:40:57.144166 sshd[9156]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Feb 9 20:40:57.149921 systemd-logind[1548]: New session 41 of user core.
Feb 9 20:40:57.151187 systemd[1]: Started session-41.scope.
Feb 9 20:40:57.142000 audit[9156]: CRED_ACQ pid=9156 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Feb 9 20:40:57.233806 sshd[9156]: pam_unix(sshd:session): session closed for user core
Feb 9 20:40:57.235236 systemd[1]: sshd@39-86.109.11.101:22-139.178.89.65:51332.service: Deactivated successfully.
Feb 9 20:40:57.235897 systemd[1]: session-41.scope: Deactivated successfully.
Feb 9 20:40:57.235900 systemd-logind[1548]: Session 41 logged out. Waiting for processes to exit.
Feb 9 20:40:57.236301 systemd-logind[1548]: Removed session 41.
Feb 9 20:40:57.323416 kernel: audit: type=1101 audit(1707511257.140:556): pid=9156 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Feb 9 20:40:57.323458 kernel: audit: type=1103 audit(1707511257.142:557): pid=9156 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Feb 9 20:40:57.323476 kernel: audit: type=1006 audit(1707511257.142:558): pid=9156 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=41 res=1
Feb 9 20:40:57.381948 kernel: audit: type=1300 audit(1707511257.142:558): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffef1aa1180 a2=3 a3=0 items=0 ppid=1 pid=9156 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=41 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 9 20:40:57.142000 audit[9156]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffef1aa1180 a2=3 a3=0 items=0 ppid=1 pid=9156 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=41 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 9 20:40:57.473884 kernel: audit: type=1327 audit(1707511257.142:558): proctitle=737368643A20636F7265205B707269765D
Feb 9 20:40:57.142000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D
Feb 9 20:40:57.504346 kernel: audit: type=1105 audit(1707511257.156:559): pid=9156 uid=0 auid=500 ses=41 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Feb 9 20:40:57.156000 audit[9156]: USER_START pid=9156 uid=0 auid=500 ses=41 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Feb 9 20:40:57.598547 kernel: audit: type=1103 audit(1707511257.157:560): pid=9159 uid=0 auid=500 ses=41 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Feb 9 20:40:57.157000 audit[9159]: CRED_ACQ pid=9159 uid=0 auid=500 ses=41 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Feb 9 20:40:57.687650 kernel: audit: type=1106 audit(1707511257.233:561): pid=9156 uid=0 auid=500 ses=41 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Feb 9 20:40:57.233000 audit[9156]: USER_END pid=9156 uid=0 auid=500 ses=41 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Feb 9 20:40:57.782946 kernel: audit: type=1104 audit(1707511257.233:562): pid=9156 uid=0 auid=500 ses=41 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Feb 9 20:40:57.233000 audit[9156]: CRED_DISP pid=9156 uid=0 auid=500 ses=41 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Feb 9 20:40:57.234000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@39-86.109.11.101:22-139.178.89.65:51332 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:41:02.240433 systemd[1]: Started sshd@40-86.109.11.101:22-139.178.89.65:55112.service.
Feb 9 20:41:02.240000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@40-86.109.11.101:22-139.178.89.65:55112 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:41:02.267450 kernel: kauditd_printk_skb: 1 callbacks suppressed
Feb 9 20:41:02.267554 kernel: audit: type=1130 audit(1707511262.240:564): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@40-86.109.11.101:22-139.178.89.65:55112 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:41:02.388000 audit[9182]: USER_ACCT pid=9182 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Feb 9 20:41:02.388839 sshd[9182]: Accepted publickey for core from 139.178.89.65 port 55112 ssh2: RSA SHA256:ya3CuIx5HRXQ7ikfrirbGy0PeU2mVoIERJKJ2pM2LHs
Feb 9 20:41:02.392209 sshd[9182]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Feb 9 20:41:02.397628 systemd-logind[1548]: New session 42 of user core.
Feb 9 20:41:02.398146 systemd[1]: Started session-42.scope.
Feb 9 20:41:02.476315 sshd[9182]: pam_unix(sshd:session): session closed for user core
Feb 9 20:41:02.477625 systemd[1]: sshd@40-86.109.11.101:22-139.178.89.65:55112.service: Deactivated successfully.
Feb 9 20:41:02.478246 systemd-logind[1548]: Session 42 logged out. Waiting for processes to exit.
Feb 9 20:41:02.478280 systemd[1]: session-42.scope: Deactivated successfully.
Feb 9 20:41:02.478845 systemd-logind[1548]: Removed session 42.
Feb 9 20:41:02.391000 audit[9182]: CRED_ACQ pid=9182 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Feb 9 20:41:02.570357 kernel: audit: type=1101 audit(1707511262.388:565): pid=9182 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Feb 9 20:41:02.570395 kernel: audit: type=1103 audit(1707511262.391:566): pid=9182 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Feb 9 20:41:02.570413 kernel: audit: type=1006 audit(1707511262.391:567): pid=9182 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=42 res=1
Feb 9 20:41:02.628815 kernel: audit: type=1300 audit(1707511262.391:567): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffcf5c43540 a2=3 a3=0 items=0 ppid=1 pid=9182 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=42 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 9 20:41:02.391000 audit[9182]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffcf5c43540 a2=3 a3=0 items=0 ppid=1 pid=9182 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=42 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 9 20:41:02.720687 kernel: audit: type=1327 audit(1707511262.391:567): proctitle=737368643A20636F7265205B707269765D
Feb 9 20:41:02.391000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D
Feb 9 20:41:02.751078 kernel: audit: type=1105 audit(1707511262.400:568): pid=9182 uid=0 auid=500 ses=42 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Feb 9 20:41:02.400000 audit[9182]: USER_START pid=9182 uid=0 auid=500 ses=42 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Feb 9 20:41:02.845257 kernel: audit: type=1103 audit(1707511262.400:569): pid=9185 uid=0 auid=500 ses=42 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Feb 9 20:41:02.400000 audit[9185]: CRED_ACQ pid=9185 uid=0 auid=500 ses=42 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Feb 9 20:41:02.934201 kernel: audit: type=1106 audit(1707511262.476:570): pid=9182 uid=0 auid=500 ses=42 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Feb 9 20:41:02.476000 audit[9182]: USER_END pid=9182 uid=0 auid=500 ses=42 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Feb 9 20:41:02.476000 audit[9182]: CRED_DISP pid=9182 uid=0 auid=500 ses=42 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Feb 9 20:41:03.118498 kernel: audit: type=1104 audit(1707511262.476:571): pid=9182 uid=0 auid=500 ses=42 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Feb 9 20:41:02.477000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@40-86.109.11.101:22-139.178.89.65:55112 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:41:03.885197 env[1563]: time="2024-02-09T20:41:03.885127774Z" level=info msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\""
Feb 9 20:41:03.898250 env[1563]: time="2024-02-09T20:41:03.898211777Z" level=error msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\" failed" error="failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 9 20:41:03.898422 kubelet[2767]: E0209 20:41:03.898390 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4"
Feb 9 20:41:03.898422 kubelet[2767]: E0209 20:41:03.898416 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4}
Feb 9 20:41:03.898658 kubelet[2767]: E0209 20:41:03.898440 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 9 20:41:03.898658 kubelet[2767]: E0209 20:41:03.898461 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-djg8j" podUID=c5bd3826-3b40-4247-a328-b65f90095c86
Feb 9 20:41:05.885497 env[1563]: time="2024-02-09T20:41:05.885337622Z" level=info msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\""
Feb 9 20:41:05.911476 env[1563]: time="2024-02-09T20:41:05.911350335Z" level=error msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\" failed" error="failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 9 20:41:05.911660 kubelet[2767]: E0209 20:41:05.911603 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1"
Feb 9 20:41:05.911660 kubelet[2767]: E0209 20:41:05.911643 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1}
Feb 9 20:41:05.911850 kubelet[2767]: E0209 20:41:05.911666 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 9 20:41:05.911850 kubelet[2767]: E0209 20:41:05.911683 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-dx2ql" podUID=21aae8c4-8c7c-48d6-86a1-b78761bdb569
Feb 9 20:41:07.481624 systemd[1]: Started sshd@41-86.109.11.101:22-139.178.89.65:55122.service.
Feb 9 20:41:07.480000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@41-86.109.11.101:22-139.178.89.65:55122 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 9 20:41:07.508954 kernel: kauditd_printk_skb: 1 callbacks suppressed
Feb 9 20:41:07.509022 kernel: audit: type=1130 audit(1707511267.480:573): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@41-86.109.11.101:22-139.178.89.65:55122 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=?
res=success' Feb 9 20:41:07.627000 audit[9267]: USER_ACCT pid=9267 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:07.628789 sshd[9267]: Accepted publickey for core from 139.178.89.65 port 55122 ssh2: RSA SHA256:ya3CuIx5HRXQ7ikfrirbGy0PeU2mVoIERJKJ2pM2LHs Feb 9 20:41:07.629963 sshd[9267]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 9 20:41:07.632613 systemd-logind[1548]: New session 43 of user core. Feb 9 20:41:07.633212 systemd[1]: Started session-43.scope. Feb 9 20:41:07.712642 sshd[9267]: pam_unix(sshd:session): session closed for user core Feb 9 20:41:07.714122 systemd[1]: sshd@41-86.109.11.101:22-139.178.89.65:55122.service: Deactivated successfully. Feb 9 20:41:07.714832 systemd[1]: session-43.scope: Deactivated successfully. Feb 9 20:41:07.714865 systemd-logind[1548]: Session 43 logged out. Waiting for processes to exit. Feb 9 20:41:07.715389 systemd-logind[1548]: Removed session 43. 
Feb 9 20:41:07.628000 audit[9267]: CRED_ACQ pid=9267 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:07.812687 kernel: audit: type=1101 audit(1707511267.627:574): pid=9267 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:07.812726 kernel: audit: type=1103 audit(1707511267.628:575): pid=9267 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:07.812743 kernel: audit: type=1006 audit(1707511267.628:576): pid=9267 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=43 res=1 Feb 9 20:41:07.871151 kernel: audit: type=1300 audit(1707511267.628:576): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffee84506e0 a2=3 a3=0 items=0 ppid=1 pid=9267 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=43 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:41:07.628000 audit[9267]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffee84506e0 a2=3 a3=0 items=0 ppid=1 pid=9267 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=43 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:41:07.963077 kernel: audit: type=1327 audit(1707511267.628:576): proctitle=737368643A20636F7265205B707269765D Feb 9 20:41:07.628000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 9 20:41:07.993471 kernel: 
audit: type=1105 audit(1707511267.634:577): pid=9267 uid=0 auid=500 ses=43 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:07.634000 audit[9267]: USER_START pid=9267 uid=0 auid=500 ses=43 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:08.087619 kernel: audit: type=1103 audit(1707511267.634:578): pid=9270 uid=0 auid=500 ses=43 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:07.634000 audit[9270]: CRED_ACQ pid=9270 uid=0 auid=500 ses=43 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:08.176542 kernel: audit: type=1106 audit(1707511267.712:579): pid=9267 uid=0 auid=500 ses=43 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:07.712000 audit[9267]: USER_END pid=9267 uid=0 auid=500 ses=43 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:08.271850 kernel: audit: 
type=1104 audit(1707511267.712:580): pid=9267 uid=0 auid=500 ses=43 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:07.712000 audit[9267]: CRED_DISP pid=9267 uid=0 auid=500 ses=43 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:07.712000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@41-86.109.11.101:22-139.178.89.65:55122 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:41:08.885237 env[1563]: time="2024-02-09T20:41:08.885102791Z" level=info msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\"" Feb 9 20:41:08.911112 env[1563]: time="2024-02-09T20:41:08.911047852Z" level=error msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\" failed" error="failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:41:08.911226 kubelet[2767]: E0209 20:41:08.911212 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e" Feb 9 
20:41:08.911420 kubelet[2767]: E0209 20:41:08.911241 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e} Feb 9 20:41:08.911420 kubelet[2767]: E0209 20:41:08.911264 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:41:08.911420 kubelet[2767]: E0209 20:41:08.911281 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-qgbqd" podUID=65f64fea-8fa3-417d-9fee-7bbdf36de2c6 Feb 9 20:41:10.885207 env[1563]: time="2024-02-09T20:41:10.885115335Z" level=info msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\"" Feb 9 20:41:10.911865 env[1563]: time="2024-02-09T20:41:10.911801306Z" level=error msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\" failed" error="failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:41:10.912054 kubelet[2767]: E0209 20:41:10.912022 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5" Feb 9 20:41:10.912054 kubelet[2767]: E0209 20:41:10.912049 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5} Feb 9 20:41:10.912262 kubelet[2767]: E0209 20:41:10.912073 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:41:10.912262 kubelet[2767]: E0209 20:41:10.912090 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-68576cfd85-6rqtz" 
podUID=0d4f3e13-ea2f-4678-87aa-ee971f79a1cb Feb 9 20:41:12.719026 systemd[1]: Started sshd@42-86.109.11.101:22-139.178.89.65:59888.service. Feb 9 20:41:12.717000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@42-86.109.11.101:22-139.178.89.65:59888 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:41:12.746072 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 9 20:41:12.746140 kernel: audit: type=1130 audit(1707511272.717:582): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@42-86.109.11.101:22-139.178.89.65:59888 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:41:12.865000 audit[9349]: USER_ACCT pid=9349 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:12.866808 sshd[9349]: Accepted publickey for core from 139.178.89.65 port 59888 ssh2: RSA SHA256:ya3CuIx5HRXQ7ikfrirbGy0PeU2mVoIERJKJ2pM2LHs Feb 9 20:41:12.867642 sshd[9349]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 9 20:41:12.869966 systemd-logind[1548]: New session 44 of user core. Feb 9 20:41:12.870445 systemd[1]: Started session-44.scope. Feb 9 20:41:12.949205 sshd[9349]: pam_unix(sshd:session): session closed for user core Feb 9 20:41:12.950759 systemd[1]: sshd@42-86.109.11.101:22-139.178.89.65:59888.service: Deactivated successfully. Feb 9 20:41:12.951309 systemd-logind[1548]: Session 44 logged out. Waiting for processes to exit. Feb 9 20:41:12.951353 systemd[1]: session-44.scope: Deactivated successfully. Feb 9 20:41:12.951872 systemd-logind[1548]: Removed session 44. 
Feb 9 20:41:12.866000 audit[9349]: CRED_ACQ pid=9349 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:13.049328 kernel: audit: type=1101 audit(1707511272.865:583): pid=9349 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:13.049369 kernel: audit: type=1103 audit(1707511272.866:584): pid=9349 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:13.049385 kernel: audit: type=1006 audit(1707511272.866:585): pid=9349 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=44 res=1 Feb 9 20:41:12.866000 audit[9349]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffde29a5350 a2=3 a3=0 items=0 ppid=1 pid=9349 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=44 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:41:13.199810 kernel: audit: type=1300 audit(1707511272.866:585): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffde29a5350 a2=3 a3=0 items=0 ppid=1 pid=9349 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=44 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:41:13.199842 kernel: audit: type=1327 audit(1707511272.866:585): proctitle=737368643A20636F7265205B707269765D Feb 9 20:41:12.866000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 9 20:41:13.230265 kernel: 
audit: type=1105 audit(1707511272.871:586): pid=9349 uid=0 auid=500 ses=44 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:12.871000 audit[9349]: USER_START pid=9349 uid=0 auid=500 ses=44 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:12.871000 audit[9352]: CRED_ACQ pid=9352 uid=0 auid=500 ses=44 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:13.413460 kernel: audit: type=1103 audit(1707511272.871:587): pid=9352 uid=0 auid=500 ses=44 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:13.413498 kernel: audit: type=1106 audit(1707511272.948:588): pid=9349 uid=0 auid=500 ses=44 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:12.948000 audit[9349]: USER_END pid=9349 uid=0 auid=500 ses=44 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:13.508768 kernel: audit: 
type=1104 audit(1707511272.948:589): pid=9349 uid=0 auid=500 ses=44 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:12.948000 audit[9349]: CRED_DISP pid=9349 uid=0 auid=500 ses=44 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:12.949000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@42-86.109.11.101:22-139.178.89.65:59888 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:41:17.956271 systemd[1]: Started sshd@43-86.109.11.101:22-139.178.89.65:59902.service. Feb 9 20:41:17.955000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@43-86.109.11.101:22-139.178.89.65:59902 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:41:17.983565 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 9 20:41:17.983648 kernel: audit: type=1130 audit(1707511277.955:591): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@43-86.109.11.101:22-139.178.89.65:59902 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 9 20:41:18.101000 audit[9377]: USER_ACCT pid=9377 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:18.102391 sshd[9377]: Accepted publickey for core from 139.178.89.65 port 59902 ssh2: RSA SHA256:ya3CuIx5HRXQ7ikfrirbGy0PeU2mVoIERJKJ2pM2LHs Feb 9 20:41:18.103617 sshd[9377]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 9 20:41:18.105960 systemd-logind[1548]: New session 45 of user core. Feb 9 20:41:18.106403 systemd[1]: Started session-45.scope. Feb 9 20:41:18.184526 sshd[9377]: pam_unix(sshd:session): session closed for user core Feb 9 20:41:18.186015 systemd[1]: sshd@43-86.109.11.101:22-139.178.89.65:59902.service: Deactivated successfully. Feb 9 20:41:18.186702 systemd[1]: session-45.scope: Deactivated successfully. Feb 9 20:41:18.186741 systemd-logind[1548]: Session 45 logged out. Waiting for processes to exit. Feb 9 20:41:18.187250 systemd-logind[1548]: Removed session 45. 
Feb 9 20:41:18.102000 audit[9377]: CRED_ACQ pid=9377 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:18.283806 kernel: audit: type=1101 audit(1707511278.101:592): pid=9377 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:18.283851 kernel: audit: type=1103 audit(1707511278.102:593): pid=9377 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:18.283872 kernel: audit: type=1006 audit(1707511278.102:594): pid=9377 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=45 res=1 Feb 9 20:41:18.102000 audit[9377]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe4b565c90 a2=3 a3=0 items=0 ppid=1 pid=9377 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=45 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:41:18.434343 kernel: audit: type=1300 audit(1707511278.102:594): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe4b565c90 a2=3 a3=0 items=0 ppid=1 pid=9377 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=45 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:41:18.434379 kernel: audit: type=1327 audit(1707511278.102:594): proctitle=737368643A20636F7265205B707269765D Feb 9 20:41:18.102000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 9 20:41:18.464773 kernel: 
audit: type=1105 audit(1707511278.107:595): pid=9377 uid=0 auid=500 ses=45 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:18.107000 audit[9377]: USER_START pid=9377 uid=0 auid=500 ses=45 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:18.558945 kernel: audit: type=1103 audit(1707511278.108:596): pid=9380 uid=0 auid=500 ses=45 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:18.108000 audit[9380]: CRED_ACQ pid=9380 uid=0 auid=500 ses=45 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:18.183000 audit[9377]: USER_END pid=9377 uid=0 auid=500 ses=45 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:18.743148 kernel: audit: type=1106 audit(1707511278.183:597): pid=9377 uid=0 auid=500 ses=45 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:18.743184 kernel: audit: 
type=1104 audit(1707511278.183:598): pid=9377 uid=0 auid=500 ses=45 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:18.183000 audit[9377]: CRED_DISP pid=9377 uid=0 auid=500 ses=45 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:18.184000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@43-86.109.11.101:22-139.178.89.65:59902 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:41:18.883993 env[1563]: time="2024-02-09T20:41:18.883962645Z" level=info msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\"" Feb 9 20:41:18.883993 env[1563]: time="2024-02-09T20:41:18.883963442Z" level=info msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\"" Feb 9 20:41:18.903514 env[1563]: time="2024-02-09T20:41:18.903433532Z" level=error msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\" failed" error="failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:41:18.903721 kubelet[2767]: E0209 20:41:18.903665 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1" Feb 9 20:41:18.903721 kubelet[2767]: E0209 20:41:18.903700 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1} Feb 9 20:41:18.904011 kubelet[2767]: E0209 20:41:18.903729 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:41:18.904011 kubelet[2767]: E0209 20:41:18.903756 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-dx2ql" podUID=21aae8c4-8c7c-48d6-86a1-b78761bdb569 Feb 9 20:41:18.904011 kubelet[2767]: E0209 20:41:18.903928 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/" podSandboxID="88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4" Feb 9 20:41:18.904011 kubelet[2767]: E0209 20:41:18.903942 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4} Feb 9 20:41:18.904203 env[1563]: time="2024-02-09T20:41:18.903795882Z" level=error msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\" failed" error="failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:41:18.904241 kubelet[2767]: E0209 20:41:18.903967 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:41:18.904241 kubelet[2767]: E0209 20:41:18.903986 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-djg8j" 
podUID=c5bd3826-3b40-4247-a328-b65f90095c86 Feb 9 20:41:21.885254 env[1563]: time="2024-02-09T20:41:21.885129097Z" level=info msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\"" Feb 9 20:41:21.885254 env[1563]: time="2024-02-09T20:41:21.885133846Z" level=info msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\"" Feb 9 20:41:21.911817 env[1563]: time="2024-02-09T20:41:21.911778388Z" level=error msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\" failed" error="failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:41:21.912007 env[1563]: time="2024-02-09T20:41:21.911803659Z" level=error msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\" failed" error="failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:41:21.912067 kubelet[2767]: E0209 20:41:21.912037 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5" Feb 9 20:41:21.912228 kubelet[2767]: E0209 20:41:21.912077 2767 kuberuntime_manager.go:965] "Failed to stop 
sandbox" podSandboxID={Type:containerd ID:4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5} Feb 9 20:41:21.912228 kubelet[2767]: E0209 20:41:21.912104 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:41:21.912228 kubelet[2767]: E0209 20:41:21.912036 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e" Feb 9 20:41:21.912228 kubelet[2767]: E0209 20:41:21.912125 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-68576cfd85-6rqtz" podUID=0d4f3e13-ea2f-4678-87aa-ee971f79a1cb Feb 9 20:41:21.912228 kubelet[2767]: E0209 20:41:21.912132 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" 
podSandboxID={Type:containerd ID:d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e} Feb 9 20:41:21.912415 kubelet[2767]: E0209 20:41:21.912152 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:41:21.912415 kubelet[2767]: E0209 20:41:21.912168 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-qgbqd" podUID=65f64fea-8fa3-417d-9fee-7bbdf36de2c6 Feb 9 20:41:23.193562 systemd[1]: Started sshd@44-86.109.11.101:22-139.178.89.65:58330.service. Feb 9 20:41:23.193000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@44-86.109.11.101:22-139.178.89.65:58330 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:41:23.228995 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 9 20:41:23.229090 kernel: audit: type=1130 audit(1707511283.193:600): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@44-86.109.11.101:22-139.178.89.65:58330 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? 
addr=? terminal=? res=success' Feb 9 20:41:23.346543 sshd[9520]: Accepted publickey for core from 139.178.89.65 port 58330 ssh2: RSA SHA256:ya3CuIx5HRXQ7ikfrirbGy0PeU2mVoIERJKJ2pM2LHs Feb 9 20:41:23.346000 audit[9520]: USER_ACCT pid=9520 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:23.347630 sshd[9520]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 9 20:41:23.349937 systemd-logind[1548]: New session 46 of user core. Feb 9 20:41:23.350373 systemd[1]: Started session-46.scope. Feb 9 20:41:23.431621 sshd[9520]: pam_unix(sshd:session): session closed for user core Feb 9 20:41:23.433035 systemd[1]: sshd@44-86.109.11.101:22-139.178.89.65:58330.service: Deactivated successfully. Feb 9 20:41:23.433675 systemd[1]: session-46.scope: Deactivated successfully. Feb 9 20:41:23.433703 systemd-logind[1548]: Session 46 logged out. Waiting for processes to exit. Feb 9 20:41:23.434159 systemd-logind[1548]: Removed session 46. 
Feb 9 20:41:23.347000 audit[9520]: CRED_ACQ pid=9520 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:23.528015 kernel: audit: type=1101 audit(1707511283.346:601): pid=9520 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:23.528055 kernel: audit: type=1103 audit(1707511283.347:602): pid=9520 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:23.528071 kernel: audit: type=1006 audit(1707511283.347:603): pid=9520 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=46 res=1 Feb 9 20:41:23.347000 audit[9520]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffca38739f0 a2=3 a3=0 items=0 ppid=1 pid=9520 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=46 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:41:23.678426 kernel: audit: type=1300 audit(1707511283.347:603): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffca38739f0 a2=3 a3=0 items=0 ppid=1 pid=9520 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=46 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:41:23.678531 kernel: audit: type=1327 audit(1707511283.347:603): proctitle=737368643A20636F7265205B707269765D Feb 9 20:41:23.347000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 9 20:41:23.351000 
audit[9520]: USER_START pid=9520 uid=0 auid=500 ses=46 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:23.803009 kernel: audit: type=1105 audit(1707511283.351:604): pid=9520 uid=0 auid=500 ses=46 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:23.803075 kernel: audit: type=1103 audit(1707511283.352:605): pid=9523 uid=0 auid=500 ses=46 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:23.352000 audit[9523]: CRED_ACQ pid=9523 uid=0 auid=500 ses=46 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:23.431000 audit[9520]: USER_END pid=9520 uid=0 auid=500 ses=46 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:23.987297 kernel: audit: type=1106 audit(1707511283.431:606): pid=9520 uid=0 auid=500 ses=46 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:23.987365 kernel: 
audit: type=1104 audit(1707511283.432:607): pid=9520 uid=0 auid=500 ses=46 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:23.432000 audit[9520]: CRED_DISP pid=9520 uid=0 auid=500 ses=46 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:23.432000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@44-86.109.11.101:22-139.178.89.65:58330 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:41:28.439164 systemd[1]: Started sshd@45-86.109.11.101:22-139.178.89.65:57206.service. Feb 9 20:41:28.438000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@45-86.109.11.101:22-139.178.89.65:57206 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:41:28.475700 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 9 20:41:28.475804 kernel: audit: type=1130 audit(1707511288.438:609): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@45-86.109.11.101:22-139.178.89.65:57206 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 9 20:41:28.591000 audit[9546]: USER_ACCT pid=9546 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:28.593373 sshd[9546]: Accepted publickey for core from 139.178.89.65 port 57206 ssh2: RSA SHA256:ya3CuIx5HRXQ7ikfrirbGy0PeU2mVoIERJKJ2pM2LHs Feb 9 20:41:28.595109 sshd[9546]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 9 20:41:28.597439 systemd-logind[1548]: New session 47 of user core. Feb 9 20:41:28.597925 systemd[1]: Started session-47.scope. Feb 9 20:41:28.675779 sshd[9546]: pam_unix(sshd:session): session closed for user core Feb 9 20:41:28.677076 systemd[1]: sshd@45-86.109.11.101:22-139.178.89.65:57206.service: Deactivated successfully. Feb 9 20:41:28.677757 systemd[1]: session-47.scope: Deactivated successfully. Feb 9 20:41:28.677774 systemd-logind[1548]: Session 47 logged out. Waiting for processes to exit. Feb 9 20:41:28.678236 systemd-logind[1548]: Removed session 47. 
Feb 9 20:41:28.593000 audit[9546]: CRED_ACQ pid=9546 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:28.774718 kernel: audit: type=1101 audit(1707511288.591:610): pid=9546 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:28.774760 kernel: audit: type=1103 audit(1707511288.593:611): pid=9546 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:28.774776 kernel: audit: type=1006 audit(1707511288.593:612): pid=9546 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=47 res=1 Feb 9 20:41:28.833243 kernel: audit: type=1300 audit(1707511288.593:612): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff59516cf0 a2=3 a3=0 items=0 ppid=1 pid=9546 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=47 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:41:28.593000 audit[9546]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff59516cf0 a2=3 a3=0 items=0 ppid=1 pid=9546 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=47 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:41:28.925262 kernel: audit: type=1327 audit(1707511288.593:612): proctitle=737368643A20636F7265205B707269765D Feb 9 20:41:28.593000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 9 20:41:28.598000 
audit[9546]: USER_START pid=9546 uid=0 auid=500 ses=47 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:28.956397 kernel: audit: type=1105 audit(1707511288.598:613): pid=9546 uid=0 auid=500 ses=47 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:28.598000 audit[9549]: CRED_ACQ pid=9549 uid=0 auid=500 ses=47 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:29.050414 kernel: audit: type=1103 audit(1707511288.598:614): pid=9549 uid=0 auid=500 ses=47 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:28.675000 audit[9546]: USER_END pid=9546 uid=0 auid=500 ses=47 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:29.234129 kernel: audit: type=1106 audit(1707511288.675:615): pid=9546 uid=0 auid=500 ses=47 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:29.234191 kernel: 
audit: type=1104 audit(1707511288.675:616): pid=9546 uid=0 auid=500 ses=47 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:28.675000 audit[9546]: CRED_DISP pid=9546 uid=0 auid=500 ses=47 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:28.675000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@45-86.109.11.101:22-139.178.89.65:57206 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:41:29.885541 env[1563]: time="2024-02-09T20:41:29.885417227Z" level=info msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\"" Feb 9 20:41:29.910528 env[1563]: time="2024-02-09T20:41:29.910448959Z" level=error msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\" failed" error="failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:41:29.910681 kubelet[2767]: E0209 20:41:29.910630 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4" 
Feb 9 20:41:29.910681 kubelet[2767]: E0209 20:41:29.910658 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4} Feb 9 20:41:29.910681 kubelet[2767]: E0209 20:41:29.910679 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:41:29.910927 kubelet[2767]: E0209 20:41:29.910698 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-djg8j" podUID=c5bd3826-3b40-4247-a328-b65f90095c86 Feb 9 20:41:32.886336 env[1563]: time="2024-02-09T20:41:32.886200491Z" level=info msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\"" Feb 9 20:41:32.886336 env[1563]: time="2024-02-09T20:41:32.886220505Z" level=info msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\"" Feb 9 20:41:32.931795 env[1563]: time="2024-02-09T20:41:32.931748682Z" level=error msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\" failed" error="failed to destroy network for sandbox 
\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:41:32.931947 kubelet[2767]: E0209 20:41:32.931934 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1" Feb 9 20:41:32.932133 kubelet[2767]: E0209 20:41:32.931965 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1} Feb 9 20:41:32.932133 kubelet[2767]: E0209 20:41:32.931988 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:41:32.932133 kubelet[2767]: E0209 20:41:32.932008 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-dx2ql" podUID=21aae8c4-8c7c-48d6-86a1-b78761bdb569 Feb 9 20:41:32.932235 env[1563]: time="2024-02-09T20:41:32.932199926Z" level=error msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\" failed" error="failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:41:32.932275 kubelet[2767]: E0209 20:41:32.932269 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5" Feb 9 20:41:32.932299 kubelet[2767]: E0209 20:41:32.932281 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5} Feb 9 20:41:32.932321 kubelet[2767]: E0209 20:41:32.932298 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/\"" Feb 9 20:41:32.932321 kubelet[2767]: E0209 20:41:32.932315 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-68576cfd85-6rqtz" podUID=0d4f3e13-ea2f-4678-87aa-ee971f79a1cb Feb 9 20:41:33.681977 systemd[1]: Started sshd@46-86.109.11.101:22-139.178.89.65:57214.service. Feb 9 20:41:33.680000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@46-86.109.11.101:22-139.178.89.65:57214 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:41:33.708666 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 9 20:41:33.708791 kernel: audit: type=1130 audit(1707511293.680:618): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@46-86.109.11.101:22-139.178.89.65:57214 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 9 20:41:33.829135 sshd[9666]: Accepted publickey for core from 139.178.89.65 port 57214 ssh2: RSA SHA256:ya3CuIx5HRXQ7ikfrirbGy0PeU2mVoIERJKJ2pM2LHs Feb 9 20:41:33.827000 audit[9666]: USER_ACCT pid=9666 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:33.829913 sshd[9666]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 9 20:41:33.832217 systemd-logind[1548]: New session 48 of user core. Feb 9 20:41:33.832716 systemd[1]: Started session-48.scope. Feb 9 20:41:33.884284 env[1563]: time="2024-02-09T20:41:33.884244440Z" level=info msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\"" Feb 9 20:41:33.897815 env[1563]: time="2024-02-09T20:41:33.897779022Z" level=error msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\" failed" error="failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:41:33.898081 kubelet[2767]: E0209 20:41:33.898008 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e" Feb 9 20:41:33.898081 kubelet[2767]: E0209 20:41:33.898045 2767 kuberuntime_manager.go:965] 
"Failed to stop sandbox" podSandboxID={Type:containerd ID:d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e} Feb 9 20:41:33.898151 kubelet[2767]: E0209 20:41:33.898082 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:41:33.898151 kubelet[2767]: E0209 20:41:33.898110 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-qgbqd" podUID=65f64fea-8fa3-417d-9fee-7bbdf36de2c6 Feb 9 20:41:33.912096 sshd[9666]: pam_unix(sshd:session): session closed for user core Feb 9 20:41:33.913800 systemd[1]: sshd@46-86.109.11.101:22-139.178.89.65:57214.service: Deactivated successfully. Feb 9 20:41:33.914593 systemd[1]: session-48.scope: Deactivated successfully. Feb 9 20:41:33.914611 systemd-logind[1548]: Session 48 logged out. Waiting for processes to exit. Feb 9 20:41:33.915177 systemd-logind[1548]: Removed session 48. 
Feb 9 20:41:33.828000 audit[9666]: CRED_ACQ pid=9666 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:33.923344 kernel: audit: type=1101 audit(1707511293.827:619): pid=9666 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:33.923387 kernel: audit: type=1103 audit(1707511293.828:620): pid=9666 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:34.071249 kernel: audit: type=1006 audit(1707511293.828:621): pid=9666 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=48 res=1 Feb 9 20:41:34.071285 kernel: audit: type=1300 audit(1707511293.828:621): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd1e793510 a2=3 a3=0 items=0 ppid=1 pid=9666 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=48 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:41:33.828000 audit[9666]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd1e793510 a2=3 a3=0 items=0 ppid=1 pid=9666 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=48 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:41:34.163142 kernel: audit: type=1327 audit(1707511293.828:621): proctitle=737368643A20636F7265205B707269765D Feb 9 20:41:33.828000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 9 20:41:34.193525 kernel: 
audit: type=1105 audit(1707511293.833:622): pid=9666 uid=0 auid=500 ses=48 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:33.833000 audit[9666]: USER_START pid=9666 uid=0 auid=500 ses=48 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:34.287647 kernel: audit: type=1103 audit(1707511293.834:623): pid=9669 uid=0 auid=500 ses=48 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:33.834000 audit[9669]: CRED_ACQ pid=9669 uid=0 auid=500 ses=48 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:34.376540 kernel: audit: type=1106 audit(1707511293.911:624): pid=9666 uid=0 auid=500 ses=48 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:33.911000 audit[9666]: USER_END pid=9666 uid=0 auid=500 ses=48 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:33.911000 audit[9666]: 
CRED_DISP pid=9666 uid=0 auid=500 ses=48 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:34.560799 kernel: audit: type=1104 audit(1707511293.911:625): pid=9666 uid=0 auid=500 ses=48 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:33.912000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@46-86.109.11.101:22-139.178.89.65:57214 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:41:38.918715 systemd[1]: Started sshd@47-86.109.11.101:22-139.178.89.65:54146.service. Feb 9 20:41:38.917000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@47-86.109.11.101:22-139.178.89.65:54146 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:41:38.945797 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 9 20:41:38.945869 kernel: audit: type=1130 audit(1707511298.917:627): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@47-86.109.11.101:22-139.178.89.65:54146 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 9 20:41:39.063000 audit[9722]: USER_ACCT pid=9722 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:39.065377 sshd[9722]: Accepted publickey for core from 139.178.89.65 port 54146 ssh2: RSA SHA256:ya3CuIx5HRXQ7ikfrirbGy0PeU2mVoIERJKJ2pM2LHs Feb 9 20:41:39.066625 sshd[9722]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 9 20:41:39.069033 systemd-logind[1548]: New session 49 of user core. Feb 9 20:41:39.069436 systemd[1]: Started session-49.scope. Feb 9 20:41:39.146680 sshd[9722]: pam_unix(sshd:session): session closed for user core Feb 9 20:41:39.148098 systemd[1]: sshd@47-86.109.11.101:22-139.178.89.65:54146.service: Deactivated successfully. Feb 9 20:41:39.148729 systemd[1]: session-49.scope: Deactivated successfully. Feb 9 20:41:39.148738 systemd-logind[1548]: Session 49 logged out. Waiting for processes to exit. Feb 9 20:41:39.149213 systemd-logind[1548]: Removed session 49. 
Feb 9 20:41:39.065000 audit[9722]: CRED_ACQ pid=9722 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:39.246639 kernel: audit: type=1101 audit(1707511299.063:628): pid=9722 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:39.246676 kernel: audit: type=1103 audit(1707511299.065:629): pid=9722 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:39.246691 kernel: audit: type=1006 audit(1707511299.065:630): pid=9722 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=49 res=1 Feb 9 20:41:39.305083 kernel: audit: type=1300 audit(1707511299.065:630): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff1e51c260 a2=3 a3=0 items=0 ppid=1 pid=9722 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=49 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:41:39.065000 audit[9722]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff1e51c260 a2=3 a3=0 items=0 ppid=1 pid=9722 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=49 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:41:39.396995 kernel: audit: type=1327 audit(1707511299.065:630): proctitle=737368643A20636F7265205B707269765D Feb 9 20:41:39.065000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 9 20:41:39.427439 kernel: 
audit: type=1105 audit(1707511299.070:631): pid=9722 uid=0 auid=500 ses=49 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:39.070000 audit[9722]: USER_START pid=9722 uid=0 auid=500 ses=49 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:39.521594 kernel: audit: type=1103 audit(1707511299.070:632): pid=9725 uid=0 auid=500 ses=49 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:39.070000 audit[9725]: CRED_ACQ pid=9725 uid=0 auid=500 ses=49 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:39.610572 kernel: audit: type=1106 audit(1707511299.146:633): pid=9722 uid=0 auid=500 ses=49 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:39.146000 audit[9722]: USER_END pid=9722 uid=0 auid=500 ses=49 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:39.705863 kernel: audit: 
type=1104 audit(1707511299.146:634): pid=9722 uid=0 auid=500 ses=49 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:39.146000 audit[9722]: CRED_DISP pid=9722 uid=0 auid=500 ses=49 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:39.146000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@47-86.109.11.101:22-139.178.89.65:54146 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:41:41.884848 env[1563]: time="2024-02-09T20:41:41.884744088Z" level=info msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\"" Feb 9 20:41:41.912323 env[1563]: time="2024-02-09T20:41:41.912288775Z" level=error msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\" failed" error="failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:41:41.912530 kubelet[2767]: E0209 20:41:41.912488 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4" Feb 9 
20:41:41.912530 kubelet[2767]: E0209 20:41:41.912515 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4} Feb 9 20:41:41.912731 kubelet[2767]: E0209 20:41:41.912536 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:41:41.912731 kubelet[2767]: E0209 20:41:41.912554 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-djg8j" podUID=c5bd3826-3b40-4247-a328-b65f90095c86 Feb 9 20:41:44.153258 systemd[1]: Started sshd@48-86.109.11.101:22-139.178.89.65:54152.service. Feb 9 20:41:44.152000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@48-86.109.11.101:22-139.178.89.65:54152 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 9 20:41:44.180323 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 9 20:41:44.180358 kernel: audit: type=1130 audit(1707511304.152:636): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@48-86.109.11.101:22-139.178.89.65:54152 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:41:44.298000 audit[9778]: USER_ACCT pid=9778 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:44.299650 sshd[9778]: Accepted publickey for core from 139.178.89.65 port 54152 ssh2: RSA SHA256:ya3CuIx5HRXQ7ikfrirbGy0PeU2mVoIERJKJ2pM2LHs Feb 9 20:41:44.301635 sshd[9778]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 9 20:41:44.304095 systemd-logind[1548]: New session 50 of user core. Feb 9 20:41:44.304696 systemd[1]: Started session-50.scope. Feb 9 20:41:44.384221 sshd[9778]: pam_unix(sshd:session): session closed for user core Feb 9 20:41:44.385852 systemd[1]: sshd@48-86.109.11.101:22-139.178.89.65:54152.service: Deactivated successfully. Feb 9 20:41:44.386626 systemd[1]: session-50.scope: Deactivated successfully. Feb 9 20:41:44.386684 systemd-logind[1548]: Session 50 logged out. Waiting for processes to exit. Feb 9 20:41:44.387197 systemd-logind[1548]: Removed session 50. 
Feb 9 20:41:44.300000 audit[9778]: CRED_ACQ pid=9778 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:44.483542 kernel: audit: type=1101 audit(1707511304.298:637): pid=9778 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:44.483594 kernel: audit: type=1103 audit(1707511304.300:638): pid=9778 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:44.483611 kernel: audit: type=1006 audit(1707511304.300:639): pid=9778 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=50 res=1 Feb 9 20:41:44.542046 kernel: audit: type=1300 audit(1707511304.300:639): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffce5aaa860 a2=3 a3=0 items=0 ppid=1 pid=9778 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=50 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:41:44.300000 audit[9778]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffce5aaa860 a2=3 a3=0 items=0 ppid=1 pid=9778 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=50 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:41:44.634004 kernel: audit: type=1327 audit(1707511304.300:639): proctitle=737368643A20636F7265205B707269765D Feb 9 20:41:44.300000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 9 20:41:44.664421 kernel: 
audit: type=1105 audit(1707511304.305:640): pid=9778 uid=0 auid=500 ses=50 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:44.305000 audit[9778]: USER_START pid=9778 uid=0 auid=500 ses=50 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:44.758582 kernel: audit: type=1103 audit(1707511304.305:641): pid=9781 uid=0 auid=500 ses=50 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:44.305000 audit[9781]: CRED_ACQ pid=9781 uid=0 auid=500 ses=50 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:44.847637 kernel: audit: type=1106 audit(1707511304.383:642): pid=9778 uid=0 auid=500 ses=50 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:44.383000 audit[9778]: USER_END pid=9778 uid=0 auid=500 ses=50 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:44.942893 kernel: audit: 
type=1104 audit(1707511304.383:643): pid=9778 uid=0 auid=500 ses=50 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:44.383000 audit[9778]: CRED_DISP pid=9778 uid=0 auid=500 ses=50 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:44.384000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@48-86.109.11.101:22-139.178.89.65:54152 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:41:45.885650 env[1563]: time="2024-02-09T20:41:45.885537218Z" level=info msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\"" Feb 9 20:41:45.937923 env[1563]: time="2024-02-09T20:41:45.937886555Z" level=error msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\" failed" error="failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:41:45.938105 kubelet[2767]: E0209 20:41:45.938069 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e" Feb 9 
20:41:45.938105 kubelet[2767]: E0209 20:41:45.938095 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e} Feb 9 20:41:45.938301 kubelet[2767]: E0209 20:41:45.938118 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:41:45.938301 kubelet[2767]: E0209 20:41:45.938136 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-qgbqd" podUID=65f64fea-8fa3-417d-9fee-7bbdf36de2c6 Feb 9 20:41:47.885135 env[1563]: time="2024-02-09T20:41:47.884986350Z" level=info msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\"" Feb 9 20:41:47.910935 env[1563]: time="2024-02-09T20:41:47.910898174Z" level=error msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\" failed" error="failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:41:47.911092 kubelet[2767]: E0209 20:41:47.911080 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5" Feb 9 20:41:47.911293 kubelet[2767]: E0209 20:41:47.911111 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5} Feb 9 20:41:47.911293 kubelet[2767]: E0209 20:41:47.911143 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:41:47.911293 kubelet[2767]: E0209 20:41:47.911175 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-68576cfd85-6rqtz" 
podUID=0d4f3e13-ea2f-4678-87aa-ee971f79a1cb Feb 9 20:41:48.885159 env[1563]: time="2024-02-09T20:41:48.885011491Z" level=info msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\"" Feb 9 20:41:48.936877 env[1563]: time="2024-02-09T20:41:48.936780639Z" level=error msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\" failed" error="failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:41:48.937064 kubelet[2767]: E0209 20:41:48.937026 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1" Feb 9 20:41:48.937401 kubelet[2767]: E0209 20:41:48.937069 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1} Feb 9 20:41:48.937401 kubelet[2767]: E0209 20:41:48.937116 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/\"" Feb 9 20:41:48.937401 kubelet[2767]: E0209 20:41:48.937151 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-dx2ql" podUID=21aae8c4-8c7c-48d6-86a1-b78761bdb569 Feb 9 20:41:49.390291 systemd[1]: Started sshd@49-86.109.11.101:22-139.178.89.65:43262.service. Feb 9 20:41:49.390000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@49-86.109.11.101:22-139.178.89.65:43262 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:41:49.417456 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 9 20:41:49.417539 kernel: audit: type=1130 audit(1707511309.390:645): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@49-86.109.11.101:22-139.178.89.65:43262 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 9 20:41:49.537000 audit[9894]: USER_ACCT pid=9894 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:49.537941 sshd[9894]: Accepted publickey for core from 139.178.89.65 port 43262 ssh2: RSA SHA256:ya3CuIx5HRXQ7ikfrirbGy0PeU2mVoIERJKJ2pM2LHs Feb 9 20:41:49.539625 sshd[9894]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 9 20:41:49.542241 systemd-logind[1548]: New session 51 of user core. Feb 9 20:41:49.542680 systemd[1]: Started session-51.scope. Feb 9 20:41:49.621270 sshd[9894]: pam_unix(sshd:session): session closed for user core Feb 9 20:41:49.622667 systemd[1]: sshd@49-86.109.11.101:22-139.178.89.65:43262.service: Deactivated successfully. Feb 9 20:41:49.623268 systemd-logind[1548]: Session 51 logged out. Waiting for processes to exit. Feb 9 20:41:49.623277 systemd[1]: session-51.scope: Deactivated successfully. Feb 9 20:41:49.624082 systemd-logind[1548]: Removed session 51. 
Feb 9 20:41:49.539000 audit[9894]: CRED_ACQ pid=9894 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:49.719215 kernel: audit: type=1101 audit(1707511309.537:646): pid=9894 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:49.719256 kernel: audit: type=1103 audit(1707511309.539:647): pid=9894 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:49.719273 kernel: audit: type=1006 audit(1707511309.539:648): pid=9894 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=51 res=1 Feb 9 20:41:49.777675 kernel: audit: type=1300 audit(1707511309.539:648): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc63e8e360 a2=3 a3=0 items=0 ppid=1 pid=9894 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=51 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:41:49.539000 audit[9894]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc63e8e360 a2=3 a3=0 items=0 ppid=1 pid=9894 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=51 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:41:49.869573 kernel: audit: type=1327 audit(1707511309.539:648): proctitle=737368643A20636F7265205B707269765D Feb 9 20:41:49.539000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 9 20:41:49.899992 kernel: 
audit: type=1105 audit(1707511309.543:649): pid=9894 uid=0 auid=500 ses=51 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:49.543000 audit[9894]: USER_START pid=9894 uid=0 auid=500 ses=51 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:49.994106 kernel: audit: type=1103 audit(1707511309.543:650): pid=9897 uid=0 auid=500 ses=51 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:49.543000 audit[9897]: CRED_ACQ pid=9897 uid=0 auid=500 ses=51 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:50.083023 kernel: audit: type=1106 audit(1707511309.617:651): pid=9894 uid=0 auid=500 ses=51 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:49.617000 audit[9894]: USER_END pid=9894 uid=0 auid=500 ses=51 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:49.617000 audit[9894]: 
CRED_DISP pid=9894 uid=0 auid=500 ses=51 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:50.267346 kernel: audit: type=1104 audit(1707511309.617:652): pid=9894 uid=0 auid=500 ses=51 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:49.622000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@49-86.109.11.101:22-139.178.89.65:43262 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:41:54.628960 systemd[1]: Started sshd@50-86.109.11.101:22-139.178.89.65:43268.service. Feb 9 20:41:54.629000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@50-86.109.11.101:22-139.178.89.65:43268 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:41:54.671475 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 9 20:41:54.671572 kernel: audit: type=1130 audit(1707511314.629:654): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@50-86.109.11.101:22-139.178.89.65:43268 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 9 20:41:54.789000 audit[9920]: USER_ACCT pid=9920 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:54.790045 sshd[9920]: Accepted publickey for core from 139.178.89.65 port 43268 ssh2: RSA SHA256:ya3CuIx5HRXQ7ikfrirbGy0PeU2mVoIERJKJ2pM2LHs Feb 9 20:41:54.791291 sshd[9920]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 9 20:41:54.793863 systemd-logind[1548]: New session 52 of user core. Feb 9 20:41:54.794300 systemd[1]: Started session-52.scope. Feb 9 20:41:54.876991 sshd[9920]: pam_unix(sshd:session): session closed for user core Feb 9 20:41:54.878433 systemd[1]: sshd@50-86.109.11.101:22-139.178.89.65:43268.service: Deactivated successfully. Feb 9 20:41:54.879081 systemd-logind[1548]: Session 52 logged out. Waiting for processes to exit. Feb 9 20:41:54.879112 systemd[1]: session-52.scope: Deactivated successfully. Feb 9 20:41:54.879643 systemd-logind[1548]: Removed session 52. 
Feb 9 20:41:54.790000 audit[9920]: CRED_ACQ pid=9920 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:54.971537 kernel: audit: type=1101 audit(1707511314.789:655): pid=9920 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:54.971577 kernel: audit: type=1103 audit(1707511314.790:656): pid=9920 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:54.971593 kernel: audit: type=1006 audit(1707511314.790:657): pid=9920 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=52 res=1 Feb 9 20:41:55.029956 kernel: audit: type=1300 audit(1707511314.790:657): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd0171b030 a2=3 a3=0 items=0 ppid=1 pid=9920 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=52 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:41:54.790000 audit[9920]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd0171b030 a2=3 a3=0 items=0 ppid=1 pid=9920 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=52 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:41:55.121889 kernel: audit: type=1327 audit(1707511314.790:657): proctitle=737368643A20636F7265205B707269765D Feb 9 20:41:54.790000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 9 20:41:54.796000 
audit[9920]: USER_START pid=9920 uid=0 auid=500 ses=52 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:55.246501 kernel: audit: type=1105 audit(1707511314.796:658): pid=9920 uid=0 auid=500 ses=52 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:55.246561 kernel: audit: type=1103 audit(1707511314.796:659): pid=9923 uid=0 auid=500 ses=52 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:54.796000 audit[9923]: CRED_ACQ pid=9923 uid=0 auid=500 ses=52 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:54.877000 audit[9920]: USER_END pid=9920 uid=0 auid=500 ses=52 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:55.430720 kernel: audit: type=1106 audit(1707511314.877:660): pid=9920 uid=0 auid=500 ses=52 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:55.430758 kernel: 
audit: type=1104 audit(1707511314.877:661): pid=9920 uid=0 auid=500 ses=52 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:54.877000 audit[9920]: CRED_DISP pid=9920 uid=0 auid=500 ses=52 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:41:54.878000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@50-86.109.11.101:22-139.178.89.65:43268 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:41:56.885417 env[1563]: time="2024-02-09T20:41:56.885361128Z" level=info msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\"" Feb 9 20:41:56.885417 env[1563]: time="2024-02-09T20:41:56.885361821Z" level=info msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\"" Feb 9 20:41:56.900003 env[1563]: time="2024-02-09T20:41:56.899965757Z" level=error msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\" failed" error="failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:41:56.900003 env[1563]: time="2024-02-09T20:41:56.899979221Z" level=error msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\" failed" error="failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:41:56.900138 kubelet[2767]: E0209 20:41:56.900123 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4" Feb 9 20:41:56.900315 kubelet[2767]: E0209 20:41:56.900133 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e" Feb 9 20:41:56.900315 kubelet[2767]: E0209 20:41:56.900155 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4} Feb 9 20:41:56.900315 kubelet[2767]: E0209 20:41:56.900157 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e} Feb 9 20:41:56.900315 kubelet[2767]: E0209 20:41:56.900178 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox 
\\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:41:56.900315 kubelet[2767]: E0209 20:41:56.900178 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:41:56.900462 kubelet[2767]: E0209 20:41:56.900195 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-djg8j" podUID=c5bd3826-3b40-4247-a328-b65f90095c86 Feb 9 20:41:56.900462 kubelet[2767]: E0209 20:41:56.900202 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-qgbqd" podUID=65f64fea-8fa3-417d-9fee-7bbdf36de2c6 Feb 9 20:41:58.883975 env[1563]: time="2024-02-09T20:41:58.883894057Z" level=info msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\"" Feb 9 20:41:58.899622 env[1563]: time="2024-02-09T20:41:58.899547200Z" level=error msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\" failed" error="failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:41:58.899763 kubelet[2767]: E0209 20:41:58.899745 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5" Feb 9 20:41:58.899999 kubelet[2767]: E0209 20:41:58.899785 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5} Feb 9 20:41:58.899999 kubelet[2767]: E0209 20:41:58.899831 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check 
that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:41:58.899999 kubelet[2767]: E0209 20:41:58.899870 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-68576cfd85-6rqtz" podUID=0d4f3e13-ea2f-4678-87aa-ee971f79a1cb Feb 9 20:41:59.883904 env[1563]: time="2024-02-09T20:41:59.883861156Z" level=info msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\"" Feb 9 20:41:59.883978 systemd[1]: Started sshd@51-86.109.11.101:22-139.178.89.65:45884.service. Feb 9 20:41:59.882000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@51-86.109.11.101:22-139.178.89.65:45884 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 9 20:41:59.896810 env[1563]: time="2024-02-09T20:41:59.896749353Z" level=error msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\" failed" error="failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:41:59.897157 kubelet[2767]: E0209 20:41:59.897023 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1" Feb 9 20:41:59.897157 kubelet[2767]: E0209 20:41:59.897052 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1} Feb 9 20:41:59.897157 kubelet[2767]: E0209 20:41:59.897072 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:41:59.897157 kubelet[2767]: E0209 20:41:59.897089 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" 
with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-dx2ql" podUID=21aae8c4-8c7c-48d6-86a1-b78761bdb569 Feb 9 20:41:59.911192 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 9 20:41:59.911231 kernel: audit: type=1130 audit(1707511319.882:663): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@51-86.109.11.101:22-139.178.89.65:45884 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:42:00.036000 audit[10030]: USER_ACCT pid=10030 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:00.037672 sshd[10030]: Accepted publickey for core from 139.178.89.65 port 45884 ssh2: RSA SHA256:ya3CuIx5HRXQ7ikfrirbGy0PeU2mVoIERJKJ2pM2LHs Feb 9 20:42:00.039636 sshd[10030]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 9 20:42:00.042117 systemd-logind[1548]: New session 53 of user core. Feb 9 20:42:00.042841 systemd[1]: Started session-53.scope. Feb 9 20:42:00.119639 sshd[10030]: pam_unix(sshd:session): session closed for user core Feb 9 20:42:00.120996 systemd[1]: sshd@51-86.109.11.101:22-139.178.89.65:45884.service: Deactivated successfully. Feb 9 20:42:00.121622 systemd-logind[1548]: Session 53 logged out. Waiting for processes to exit. Feb 9 20:42:00.121682 systemd[1]: session-53.scope: Deactivated successfully. Feb 9 20:42:00.122117 systemd-logind[1548]: Removed session 53. 
Feb 9 20:42:00.038000 audit[10030]: CRED_ACQ pid=10030 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:00.219190 kernel: audit: type=1101 audit(1707511320.036:664): pid=10030 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:00.219224 kernel: audit: type=1103 audit(1707511320.038:665): pid=10030 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:00.219240 kernel: audit: type=1006 audit(1707511320.038:666): pid=10030 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=53 res=1 Feb 9 20:42:00.277752 kernel: audit: type=1300 audit(1707511320.038:666): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffa7acba40 a2=3 a3=0 items=0 ppid=1 pid=10030 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=53 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:42:00.038000 audit[10030]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffa7acba40 a2=3 a3=0 items=0 ppid=1 pid=10030 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=53 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:42:00.369739 kernel: audit: type=1327 audit(1707511320.038:666): proctitle=737368643A20636F7265205B707269765D Feb 9 20:42:00.038000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 9 20:42:00.400118 
kernel: audit: type=1105 audit(1707511320.043:667): pid=10030 uid=0 auid=500 ses=53 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:00.043000 audit[10030]: USER_START pid=10030 uid=0 auid=500 ses=53 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:00.494352 kernel: audit: type=1103 audit(1707511320.044:668): pid=10062 uid=0 auid=500 ses=53 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:00.044000 audit[10062]: CRED_ACQ pid=10062 uid=0 auid=500 ses=53 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:00.583403 kernel: audit: type=1106 audit(1707511320.118:669): pid=10030 uid=0 auid=500 ses=53 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:00.118000 audit[10030]: USER_END pid=10030 uid=0 auid=500 ses=53 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:00.678661 
kernel: audit: type=1104 audit(1707511320.118:670): pid=10030 uid=0 auid=500 ses=53 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:00.118000 audit[10030]: CRED_DISP pid=10030 uid=0 auid=500 ses=53 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:00.119000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@51-86.109.11.101:22-139.178.89.65:45884 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:42:05.126341 systemd[1]: Started sshd@52-86.109.11.101:22-139.178.89.65:45896.service. Feb 9 20:42:05.125000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@52-86.109.11.101:22-139.178.89.65:45896 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:42:05.153532 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 9 20:42:05.153582 kernel: audit: type=1130 audit(1707511325.125:672): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@52-86.109.11.101:22-139.178.89.65:45896 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 9 20:42:05.271000 audit[10087]: USER_ACCT pid=10087 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:05.273623 sshd[10087]: Accepted publickey for core from 139.178.89.65 port 45896 ssh2: RSA SHA256:ya3CuIx5HRXQ7ikfrirbGy0PeU2mVoIERJKJ2pM2LHs Feb 9 20:42:05.277592 sshd[10087]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 9 20:42:05.287263 systemd-logind[1548]: New session 54 of user core. Feb 9 20:42:05.290684 systemd[1]: Started session-54.scope. Feb 9 20:42:05.275000 audit[10087]: CRED_ACQ pid=10087 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:05.370980 sshd[10087]: pam_unix(sshd:session): session closed for user core Feb 9 20:42:05.372444 systemd[1]: sshd@52-86.109.11.101:22-139.178.89.65:45896.service: Deactivated successfully. Feb 9 20:42:05.373066 systemd[1]: session-54.scope: Deactivated successfully. Feb 9 20:42:05.373091 systemd-logind[1548]: Session 54 logged out. Waiting for processes to exit. Feb 9 20:42:05.373545 systemd-logind[1548]: Removed session 54. 
Feb 9 20:42:05.454770 kernel: audit: type=1101 audit(1707511325.271:673): pid=10087 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:05.454807 kernel: audit: type=1103 audit(1707511325.275:674): pid=10087 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:05.454826 kernel: audit: type=1006 audit(1707511325.275:675): pid=10087 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=54 res=1 Feb 9 20:42:05.513289 kernel: audit: type=1300 audit(1707511325.275:675): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc550d0bf0 a2=3 a3=0 items=0 ppid=1 pid=10087 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=54 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:42:05.275000 audit[10087]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc550d0bf0 a2=3 a3=0 items=0 ppid=1 pid=10087 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=54 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:42:05.605165 kernel: audit: type=1327 audit(1707511325.275:675): proctitle=737368643A20636F7265205B707269765D Feb 9 20:42:05.275000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 9 20:42:05.635512 kernel: audit: type=1105 audit(1707511325.295:676): pid=10087 uid=0 auid=500 ses=54 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" 
hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:05.295000 audit[10087]: USER_START pid=10087 uid=0 auid=500 ses=54 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:05.729572 kernel: audit: type=1103 audit(1707511325.296:677): pid=10090 uid=0 auid=500 ses=54 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:05.296000 audit[10090]: CRED_ACQ pid=10090 uid=0 auid=500 ses=54 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:05.818422 kernel: audit: type=1106 audit(1707511325.370:678): pid=10087 uid=0 auid=500 ses=54 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:05.370000 audit[10087]: USER_END pid=10087 uid=0 auid=500 ses=54 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:05.913629 kernel: audit: type=1104 audit(1707511325.370:679): pid=10087 uid=0 auid=500 ses=54 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 
9 20:42:05.370000 audit[10087]: CRED_DISP pid=10087 uid=0 auid=500 ses=54 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:05.371000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@52-86.109.11.101:22-139.178.89.65:45896 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:42:07.885371 env[1563]: time="2024-02-09T20:42:07.885255055Z" level=info msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\"" Feb 9 20:42:07.914534 env[1563]: time="2024-02-09T20:42:07.914479046Z" level=error msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\" failed" error="failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:42:07.914732 kubelet[2767]: E0209 20:42:07.914720 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4" Feb 9 20:42:07.914909 kubelet[2767]: E0209 20:42:07.914747 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4} Feb 9 20:42:07.914909 kubelet[2767]: E0209 
20:42:07.914770 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:42:07.914909 kubelet[2767]: E0209 20:42:07.914788 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-djg8j" podUID=c5bd3826-3b40-4247-a328-b65f90095c86 Feb 9 20:42:08.885387 env[1563]: time="2024-02-09T20:42:08.885269251Z" level=info msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\"" Feb 9 20:42:08.915286 env[1563]: time="2024-02-09T20:42:08.915228996Z" level=error msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\" failed" error="failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:42:08.915485 kubelet[2767]: E0209 20:42:08.915441 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy 
network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e" Feb 9 20:42:08.915485 kubelet[2767]: E0209 20:42:08.915467 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e} Feb 9 20:42:08.915682 kubelet[2767]: E0209 20:42:08.915488 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:42:08.915682 kubelet[2767]: E0209 20:42:08.915506 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-qgbqd" podUID=65f64fea-8fa3-417d-9fee-7bbdf36de2c6 Feb 9 20:42:10.378092 systemd[1]: Started sshd@53-86.109.11.101:22-139.178.89.65:35984.service. 
Feb 9 20:42:10.376000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@53-86.109.11.101:22-139.178.89.65:35984 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:42:10.405009 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 9 20:42:10.405101 kernel: audit: type=1130 audit(1707511330.376:681): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@53-86.109.11.101:22-139.178.89.65:35984 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:42:10.523000 audit[10171]: USER_ACCT pid=10171 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:10.524853 sshd[10171]: Accepted publickey for core from 139.178.89.65 port 35984 ssh2: RSA SHA256:ya3CuIx5HRXQ7ikfrirbGy0PeU2mVoIERJKJ2pM2LHs Feb 9 20:42:10.525950 sshd[10171]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 9 20:42:10.528151 systemd-logind[1548]: New session 55 of user core. Feb 9 20:42:10.528626 systemd[1]: Started session-55.scope. Feb 9 20:42:10.607043 sshd[10171]: pam_unix(sshd:session): session closed for user core Feb 9 20:42:10.608450 systemd[1]: sshd@53-86.109.11.101:22-139.178.89.65:35984.service: Deactivated successfully. Feb 9 20:42:10.609101 systemd-logind[1548]: Session 55 logged out. Waiting for processes to exit. Feb 9 20:42:10.609134 systemd[1]: session-55.scope: Deactivated successfully. Feb 9 20:42:10.609626 systemd-logind[1548]: Removed session 55. 
Feb 9 20:42:10.524000 audit[10171]: CRED_ACQ pid=10171 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:10.706082 kernel: audit: type=1101 audit(1707511330.523:682): pid=10171 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:10.706127 kernel: audit: type=1103 audit(1707511330.524:683): pid=10171 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:10.706143 kernel: audit: type=1006 audit(1707511330.524:684): pid=10171 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=55 res=1 Feb 9 20:42:10.764506 kernel: audit: type=1300 audit(1707511330.524:684): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffdd0f8d170 a2=3 a3=0 items=0 ppid=1 pid=10171 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=55 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:42:10.524000 audit[10171]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffdd0f8d170 a2=3 a3=0 items=0 ppid=1 pid=10171 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=55 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:42:10.856314 kernel: audit: type=1327 audit(1707511330.524:684): proctitle=737368643A20636F7265205B707269765D Feb 9 20:42:10.524000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 9 20:42:10.883888 
env[1563]: time="2024-02-09T20:42:10.883870689Z" level=info msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\"" Feb 9 20:42:10.529000 audit[10171]: USER_START pid=10171 uid=0 auid=500 ses=55 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:10.896157 env[1563]: time="2024-02-09T20:42:10.896095212Z" level=error msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\" failed" error="failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:42:10.896280 kubelet[2767]: E0209 20:42:10.896269 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1" Feb 9 20:42:10.896465 kubelet[2767]: E0209 20:42:10.896297 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1} Feb 9 20:42:10.896465 kubelet[2767]: E0209 20:42:10.896319 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed 
to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:42:10.896465 kubelet[2767]: E0209 20:42:10.896346 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-dx2ql" podUID=21aae8c4-8c7c-48d6-86a1-b78761bdb569 Feb 9 20:42:10.980803 kernel: audit: type=1105 audit(1707511330.529:685): pid=10171 uid=0 auid=500 ses=55 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:10.980839 kernel: audit: type=1103 audit(1707511330.530:686): pid=10174 uid=0 auid=500 ses=55 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:10.530000 audit[10174]: CRED_ACQ pid=10174 uid=0 auid=500 ses=55 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:11.069666 kernel: audit: type=1106 audit(1707511330.606:687): pid=10171 uid=0 auid=500 ses=55 
subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:10.606000 audit[10171]: USER_END pid=10171 uid=0 auid=500 ses=55 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:11.164824 kernel: audit: type=1104 audit(1707511330.606:688): pid=10171 uid=0 auid=500 ses=55 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:10.606000 audit[10171]: CRED_DISP pid=10171 uid=0 auid=500 ses=55 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:10.607000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@53-86.109.11.101:22-139.178.89.65:35984 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 9 20:42:11.885536 env[1563]: time="2024-02-09T20:42:11.885434481Z" level=info msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\"" Feb 9 20:42:11.912031 env[1563]: time="2024-02-09T20:42:11.911993702Z" level=error msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\" failed" error="failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:42:11.912190 kubelet[2767]: E0209 20:42:11.912178 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5" Feb 9 20:42:11.912402 kubelet[2767]: E0209 20:42:11.912207 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5} Feb 9 20:42:11.912402 kubelet[2767]: E0209 20:42:11.912228 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 
20:42:11.912402 kubelet[2767]: E0209 20:42:11.912245 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-68576cfd85-6rqtz" podUID=0d4f3e13-ea2f-4678-87aa-ee971f79a1cb Feb 9 20:42:15.614208 systemd[1]: Started sshd@54-86.109.11.101:22-139.178.89.65:35998.service. Feb 9 20:42:15.613000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@54-86.109.11.101:22-139.178.89.65:35998 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:42:15.641371 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 9 20:42:15.641446 kernel: audit: type=1130 audit(1707511335.613:690): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@54-86.109.11.101:22-139.178.89.65:35998 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 9 20:42:15.759000 audit[10252]: USER_ACCT pid=10252 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:15.761191 sshd[10252]: Accepted publickey for core from 139.178.89.65 port 35998 ssh2: RSA SHA256:ya3CuIx5HRXQ7ikfrirbGy0PeU2mVoIERJKJ2pM2LHs Feb 9 20:42:15.762620 sshd[10252]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 9 20:42:15.764946 systemd-logind[1548]: New session 56 of user core. Feb 9 20:42:15.765410 systemd[1]: Started session-56.scope. Feb 9 20:42:15.843616 sshd[10252]: pam_unix(sshd:session): session closed for user core Feb 9 20:42:15.844879 systemd[1]: sshd@54-86.109.11.101:22-139.178.89.65:35998.service: Deactivated successfully. Feb 9 20:42:15.845489 systemd-logind[1548]: Session 56 logged out. Waiting for processes to exit. Feb 9 20:42:15.845524 systemd[1]: session-56.scope: Deactivated successfully. Feb 9 20:42:15.846024 systemd-logind[1548]: Removed session 56. 
Feb 9 20:42:15.761000 audit[10252]: CRED_ACQ pid=10252 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:15.943701 kernel: audit: type=1101 audit(1707511335.759:691): pid=10252 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:15.943737 kernel: audit: type=1103 audit(1707511335.761:692): pid=10252 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:15.943756 kernel: audit: type=1006 audit(1707511335.761:693): pid=10252 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=56 res=1 Feb 9 20:42:16.002186 kernel: audit: type=1300 audit(1707511335.761:693): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe0d41d8d0 a2=3 a3=0 items=0 ppid=1 pid=10252 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=56 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:42:15.761000 audit[10252]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe0d41d8d0 a2=3 a3=0 items=0 ppid=1 pid=10252 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=56 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:42:16.094039 kernel: audit: type=1327 audit(1707511335.761:693): proctitle=737368643A20636F7265205B707269765D Feb 9 20:42:15.761000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 9 20:42:16.124421 
kernel: audit: type=1105 audit(1707511335.766:694): pid=10252 uid=0 auid=500 ses=56 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:15.766000 audit[10252]: USER_START pid=10252 uid=0 auid=500 ses=56 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:16.218621 kernel: audit: type=1103 audit(1707511335.767:695): pid=10255 uid=0 auid=500 ses=56 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:15.767000 audit[10255]: CRED_ACQ pid=10255 uid=0 auid=500 ses=56 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:16.307544 kernel: audit: type=1106 audit(1707511335.842:696): pid=10252 uid=0 auid=500 ses=56 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:15.842000 audit[10252]: USER_END pid=10252 uid=0 auid=500 ses=56 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:16.402793 
kernel: audit: type=1104 audit(1707511335.842:697): pid=10252 uid=0 auid=500 ses=56 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:15.842000 audit[10252]: CRED_DISP pid=10252 uid=0 auid=500 ses=56 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:15.843000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@54-86.109.11.101:22-139.178.89.65:35998 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:42:20.851408 systemd[1]: Started sshd@55-86.109.11.101:22-139.178.89.65:54360.service. Feb 9 20:42:20.850000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@55-86.109.11.101:22-139.178.89.65:54360 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:42:20.885518 env[1563]: time="2024-02-09T20:42:20.885442380Z" level=info msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\"" Feb 9 20:42:20.894841 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 9 20:42:20.894980 kernel: audit: type=1130 audit(1707511340.850:699): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@55-86.109.11.101:22-139.178.89.65:54360 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 9 20:42:20.909558 env[1563]: time="2024-02-09T20:42:20.909509866Z" level=error msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\" failed" error="failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:42:20.909768 kubelet[2767]: E0209 20:42:20.909724 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4" Feb 9 20:42:20.909768 kubelet[2767]: E0209 20:42:20.909759 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4} Feb 9 20:42:20.910066 kubelet[2767]: E0209 20:42:20.909791 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:42:20.910066 kubelet[2767]: E0209 20:42:20.909817 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" 
with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-djg8j" podUID=c5bd3826-3b40-4247-a328-b65f90095c86 Feb 9 20:42:21.019000 audit[10280]: USER_ACCT pid=10280 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:21.021213 sshd[10280]: Accepted publickey for core from 139.178.89.65 port 54360 ssh2: RSA SHA256:ya3CuIx5HRXQ7ikfrirbGy0PeU2mVoIERJKJ2pM2LHs Feb 9 20:42:21.022737 sshd[10280]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 9 20:42:21.025940 systemd-logind[1548]: New session 57 of user core. Feb 9 20:42:21.026492 systemd[1]: Started session-57.scope. Feb 9 20:42:21.104116 sshd[10280]: pam_unix(sshd:session): session closed for user core Feb 9 20:42:21.105512 systemd[1]: sshd@55-86.109.11.101:22-139.178.89.65:54360.service: Deactivated successfully. Feb 9 20:42:21.106159 systemd[1]: session-57.scope: Deactivated successfully. Feb 9 20:42:21.106161 systemd-logind[1548]: Session 57 logged out. Waiting for processes to exit. Feb 9 20:42:21.106771 systemd-logind[1548]: Removed session 57. 
Feb 9 20:42:21.112347 kernel: audit: type=1101 audit(1707511341.019:700): pid=10280 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:21.112380 kernel: audit: type=1103 audit(1707511341.021:701): pid=10280 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:21.021000 audit[10280]: CRED_ACQ pid=10280 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:21.261146 kernel: audit: type=1006 audit(1707511341.021:702): pid=10280 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=57 res=1 Feb 9 20:42:21.261178 kernel: audit: type=1300 audit(1707511341.021:702): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd386af8c0 a2=3 a3=0 items=0 ppid=1 pid=10280 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=57 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:42:21.021000 audit[10280]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd386af8c0 a2=3 a3=0 items=0 ppid=1 pid=10280 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=57 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:42:21.021000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 9 20:42:21.383476 kernel: audit: type=1327 audit(1707511341.021:702): proctitle=737368643A20636F7265205B707269765D Feb 9 20:42:21.383509 
kernel: audit: type=1105 audit(1707511341.028:703): pid=10280 uid=0 auid=500 ses=57 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:21.028000 audit[10280]: USER_START pid=10280 uid=0 auid=500 ses=57 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:21.477642 kernel: audit: type=1103 audit(1707511341.028:704): pid=10313 uid=0 auid=500 ses=57 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:21.028000 audit[10313]: CRED_ACQ pid=10313 uid=0 auid=500 ses=57 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:21.566588 kernel: audit: type=1106 audit(1707511341.103:705): pid=10280 uid=0 auid=500 ses=57 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:21.103000 audit[10280]: USER_END pid=10280 uid=0 auid=500 ses=57 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:21.661880 
kernel: audit: type=1104 audit(1707511341.103:706): pid=10280 uid=0 auid=500 ses=57 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:21.103000 audit[10280]: CRED_DISP pid=10280 uid=0 auid=500 ses=57 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:21.104000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@55-86.109.11.101:22-139.178.89.65:54360 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:42:22.885787 env[1563]: time="2024-02-09T20:42:22.885734192Z" level=info msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\"" Feb 9 20:42:22.886102 env[1563]: time="2024-02-09T20:42:22.885734029Z" level=info msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\"" Feb 9 20:42:22.900982 env[1563]: time="2024-02-09T20:42:22.900917021Z" level=error msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\" failed" error="failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:42:22.901126 env[1563]: time="2024-02-09T20:42:22.901021732Z" level=error msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\" failed" error="failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:42:22.901178 kubelet[2767]: E0209 20:42:22.901114 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5" Feb 9 20:42:22.901178 kubelet[2767]: E0209 20:42:22.901142 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e" Feb 9 20:42:22.901178 kubelet[2767]: E0209 20:42:22.901153 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5} Feb 9 20:42:22.901178 kubelet[2767]: E0209 20:42:22.901164 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e} Feb 9 20:42:22.901498 kubelet[2767]: E0209 20:42:22.901187 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox 
\\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:42:22.901498 kubelet[2767]: E0209 20:42:22.901192 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:42:22.901498 kubelet[2767]: E0209 20:42:22.901213 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-68576cfd85-6rqtz" podUID=0d4f3e13-ea2f-4678-87aa-ee971f79a1cb Feb 9 20:42:22.901656 kubelet[2767]: E0209 20:42:22.901213 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and 
has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-qgbqd" podUID=65f64fea-8fa3-417d-9fee-7bbdf36de2c6 Feb 9 20:42:25.885076 env[1563]: time="2024-02-09T20:42:25.884966024Z" level=info msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\"" Feb 9 20:42:25.911500 env[1563]: time="2024-02-09T20:42:25.911436932Z" level=error msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\" failed" error="failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:42:25.911608 kubelet[2767]: E0209 20:42:25.911596 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1" Feb 9 20:42:25.911786 kubelet[2767]: E0209 20:42:25.911625 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1} Feb 9 20:42:25.911786 kubelet[2767]: E0209 20:42:25.911649 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:42:25.911786 kubelet[2767]: E0209 20:42:25.911669 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-dx2ql" podUID=21aae8c4-8c7c-48d6-86a1-b78761bdb569 Feb 9 20:42:26.111269 systemd[1]: Started sshd@56-86.109.11.101:22-139.178.89.65:54368.service. Feb 9 20:42:26.110000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@56-86.109.11.101:22-139.178.89.65:54368 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:42:26.138360 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 9 20:42:26.138474 kernel: audit: type=1130 audit(1707511346.110:708): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@56-86.109.11.101:22-139.178.89.65:54368 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 9 20:42:26.257000 audit[10424]: USER_ACCT pid=10424 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:26.259419 sshd[10424]: Accepted publickey for core from 139.178.89.65 port 54368 ssh2: RSA SHA256:ya3CuIx5HRXQ7ikfrirbGy0PeU2mVoIERJKJ2pM2LHs Feb 9 20:42:26.260643 sshd[10424]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 9 20:42:26.262970 systemd-logind[1548]: New session 58 of user core. Feb 9 20:42:26.263430 systemd[1]: Started session-58.scope. Feb 9 20:42:26.340039 sshd[10424]: pam_unix(sshd:session): session closed for user core Feb 9 20:42:26.341298 systemd[1]: sshd@56-86.109.11.101:22-139.178.89.65:54368.service: Deactivated successfully. Feb 9 20:42:26.341943 systemd[1]: session-58.scope: Deactivated successfully. Feb 9 20:42:26.341957 systemd-logind[1548]: Session 58 logged out. Waiting for processes to exit. Feb 9 20:42:26.342407 systemd-logind[1548]: Removed session 58. 
Feb 9 20:42:26.259000 audit[10424]: CRED_ACQ pid=10424 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:26.440717 kernel: audit: type=1101 audit(1707511346.257:709): pid=10424 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:26.440787 kernel: audit: type=1103 audit(1707511346.259:710): pid=10424 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:26.440830 kernel: audit: type=1006 audit(1707511346.259:711): pid=10424 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=58 res=1 Feb 9 20:42:26.259000 audit[10424]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffd2f900f0 a2=3 a3=0 items=0 ppid=1 pid=10424 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=58 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:42:26.591165 kernel: audit: type=1300 audit(1707511346.259:711): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffd2f900f0 a2=3 a3=0 items=0 ppid=1 pid=10424 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=58 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:42:26.591248 kernel: audit: type=1327 audit(1707511346.259:711): proctitle=737368643A20636F7265205B707269765D Feb 9 20:42:26.259000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 9 20:42:26.264000 
audit[10424]: USER_START pid=10424 uid=0 auid=500 ses=58 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:26.715913 kernel: audit: type=1105 audit(1707511346.264:712): pid=10424 uid=0 auid=500 ses=58 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:26.264000 audit[10427]: CRED_ACQ pid=10427 uid=0 auid=500 ses=58 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:26.804886 kernel: audit: type=1103 audit(1707511346.264:713): pid=10427 uid=0 auid=500 ses=58 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:26.804943 kernel: audit: type=1106 audit(1707511346.339:714): pid=10424 uid=0 auid=500 ses=58 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:26.339000 audit[10424]: USER_END pid=10424 uid=0 auid=500 ses=58 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:26.339000 
audit[10424]: CRED_DISP pid=10424 uid=0 auid=500 ses=58 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:26.989250 kernel: audit: type=1104 audit(1707511346.339:715): pid=10424 uid=0 auid=500 ses=58 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:26.340000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@56-86.109.11.101:22-139.178.89.65:54368 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:42:31.346628 systemd[1]: Started sshd@57-86.109.11.101:22-139.178.89.65:42586.service. Feb 9 20:42:31.345000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@57-86.109.11.101:22-139.178.89.65:42586 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:42:31.373309 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 9 20:42:31.373405 kernel: audit: type=1130 audit(1707511351.345:717): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@57-86.109.11.101:22-139.178.89.65:42586 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:42:31.403445 sshd[10450]: Accepted publickey for core from 139.178.89.65 port 42586 ssh2: RSA SHA256:ya3CuIx5HRXQ7ikfrirbGy0PeU2mVoIERJKJ2pM2LHs Feb 9 20:42:31.404599 sshd[10450]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 9 20:42:31.407281 systemd-logind[1548]: New session 59 of user core. Feb 9 20:42:31.407837 systemd[1]: Started session-59.scope. 
Feb 9 20:42:31.402000 audit[10450]: USER_ACCT pid=10450 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:31.486085 sshd[10450]: pam_unix(sshd:session): session closed for user core Feb 9 20:42:31.487528 systemd[1]: sshd@57-86.109.11.101:22-139.178.89.65:42586.service: Deactivated successfully. Feb 9 20:42:31.488150 systemd[1]: session-59.scope: Deactivated successfully. Feb 9 20:42:31.488182 systemd-logind[1548]: Session 59 logged out. Waiting for processes to exit. Feb 9 20:42:31.488827 systemd-logind[1548]: Removed session 59. Feb 9 20:42:31.556083 kernel: audit: type=1101 audit(1707511351.402:718): pid=10450 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:31.556127 kernel: audit: type=1103 audit(1707511351.402:719): pid=10450 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:31.402000 audit[10450]: CRED_ACQ pid=10450 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:31.704914 kernel: audit: type=1006 audit(1707511351.404:720): pid=10450 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=59 res=1 Feb 9 20:42:31.704954 kernel: audit: type=1300 audit(1707511351.404:720): arch=c000003e syscall=1 success=yes exit=3 
a0=5 a1=7ffe87400930 a2=3 a3=0 items=0 ppid=1 pid=10450 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=59 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:42:31.404000 audit[10450]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe87400930 a2=3 a3=0 items=0 ppid=1 pid=10450 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=59 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:42:31.796814 kernel: audit: type=1327 audit(1707511351.404:720): proctitle=737368643A20636F7265205B707269765D Feb 9 20:42:31.404000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 9 20:42:31.827202 kernel: audit: type=1105 audit(1707511351.408:721): pid=10450 uid=0 auid=500 ses=59 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:31.408000 audit[10450]: USER_START pid=10450 uid=0 auid=500 ses=59 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:31.921422 kernel: audit: type=1103 audit(1707511351.409:722): pid=10453 uid=0 auid=500 ses=59 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:31.409000 audit[10453]: CRED_ACQ pid=10453 uid=0 auid=500 ses=59 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh 
res=success' Feb 9 20:42:32.010421 kernel: audit: type=1106 audit(1707511351.485:723): pid=10450 uid=0 auid=500 ses=59 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:31.485000 audit[10450]: USER_END pid=10450 uid=0 auid=500 ses=59 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:32.105657 kernel: audit: type=1104 audit(1707511351.485:724): pid=10450 uid=0 auid=500 ses=59 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:31.485000 audit[10450]: CRED_DISP pid=10450 uid=0 auid=500 ses=59 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:31.487000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@57-86.109.11.101:22-139.178.89.65:42586 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 9 20:42:33.885756 env[1563]: time="2024-02-09T20:42:33.885628132Z" level=info msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\"" Feb 9 20:42:33.886726 env[1563]: time="2024-02-09T20:42:33.885623791Z" level=info msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\"" Feb 9 20:42:33.916280 env[1563]: time="2024-02-09T20:42:33.916242986Z" level=error msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\" failed" error="failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:42:33.916408 env[1563]: time="2024-02-09T20:42:33.916248354Z" level=error msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\" failed" error="failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:42:33.916545 kubelet[2767]: E0209 20:42:33.916479 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4" Feb 9 20:42:33.916545 kubelet[2767]: E0209 20:42:33.916540 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" 
podSandboxID={Type:containerd ID:88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4} Feb 9 20:42:33.916760 kubelet[2767]: E0209 20:42:33.916561 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:42:33.916760 kubelet[2767]: E0209 20:42:33.916579 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-djg8j" podUID=c5bd3826-3b40-4247-a328-b65f90095c86 Feb 9 20:42:33.916760 kubelet[2767]: E0209 20:42:33.916479 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e" Feb 9 20:42:33.916760 kubelet[2767]: E0209 20:42:33.916601 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd 
ID:d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e} Feb 9 20:42:33.916866 kubelet[2767]: E0209 20:42:33.916641 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:42:33.916866 kubelet[2767]: E0209 20:42:33.916655 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-qgbqd" podUID=65f64fea-8fa3-417d-9fee-7bbdf36de2c6 Feb 9 20:42:36.493873 systemd[1]: Started sshd@58-86.109.11.101:22-139.178.89.65:42590.service. Feb 9 20:42:36.493000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@58-86.109.11.101:22-139.178.89.65:42590 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:42:36.521638 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 9 20:42:36.521728 kernel: audit: type=1130 audit(1707511356.493:726): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@58-86.109.11.101:22-139.178.89.65:42590 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 9 20:42:36.640000 audit[10531]: USER_ACCT pid=10531 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:36.641435 sshd[10531]: Accepted publickey for core from 139.178.89.65 port 42590 ssh2: RSA SHA256:ya3CuIx5HRXQ7ikfrirbGy0PeU2mVoIERJKJ2pM2LHs Feb 9 20:42:36.642649 sshd[10531]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 9 20:42:36.645025 systemd-logind[1548]: New session 60 of user core. Feb 9 20:42:36.645444 systemd[1]: Started session-60.scope. Feb 9 20:42:36.722553 sshd[10531]: pam_unix(sshd:session): session closed for user core Feb 9 20:42:36.723916 systemd[1]: sshd@58-86.109.11.101:22-139.178.89.65:42590.service: Deactivated successfully. Feb 9 20:42:36.724590 systemd[1]: session-60.scope: Deactivated successfully. Feb 9 20:42:36.724604 systemd-logind[1548]: Session 60 logged out. Waiting for processes to exit. Feb 9 20:42:36.725145 systemd-logind[1548]: Removed session 60. 
Feb 9 20:42:36.642000 audit[10531]: CRED_ACQ pid=10531 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:36.822756 kernel: audit: type=1101 audit(1707511356.640:727): pid=10531 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:36.822813 kernel: audit: type=1103 audit(1707511356.642:728): pid=10531 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:36.822852 kernel: audit: type=1006 audit(1707511356.642:729): pid=10531 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=60 res=1 Feb 9 20:42:36.642000 audit[10531]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffff0f178c0 a2=3 a3=0 items=0 ppid=1 pid=10531 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=60 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:42:36.883885 env[1563]: time="2024-02-09T20:42:36.883863376Z" level=info msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\"" Feb 9 20:42:36.895430 env[1563]: time="2024-02-09T20:42:36.895392876Z" level=error msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\" failed" error="failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:42:36.895593 kubelet[2767]: E0209 20:42:36.895578 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5" Feb 9 20:42:36.895935 kubelet[2767]: E0209 20:42:36.895605 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5} Feb 9 20:42:36.895935 kubelet[2767]: E0209 20:42:36.895626 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:42:36.895935 kubelet[2767]: E0209 20:42:36.895645 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="calico-system/calico-kube-controllers-68576cfd85-6rqtz" podUID=0d4f3e13-ea2f-4678-87aa-ee971f79a1cb Feb 9 20:42:36.973085 kernel: audit: type=1300 audit(1707511356.642:729): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffff0f178c0 a2=3 a3=0 items=0 ppid=1 pid=10531 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=60 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:42:36.973145 kernel: audit: type=1327 audit(1707511356.642:729): proctitle=737368643A20636F7265205B707269765D Feb 9 20:42:36.642000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 9 20:42:36.647000 audit[10531]: USER_START pid=10531 uid=0 auid=500 ses=60 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:37.097612 kernel: audit: type=1105 audit(1707511356.647:730): pid=10531 uid=0 auid=500 ses=60 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:37.097679 kernel: audit: type=1103 audit(1707511356.647:731): pid=10534 uid=0 auid=500 ses=60 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:36.647000 audit[10534]: CRED_ACQ pid=10534 uid=0 auid=500 ses=60 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:36.722000 audit[10531]: USER_END pid=10531 uid=0 auid=500 
ses=60 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:37.281801 kernel: audit: type=1106 audit(1707511356.722:732): pid=10531 uid=0 auid=500 ses=60 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:37.281846 kernel: audit: type=1104 audit(1707511356.722:733): pid=10531 uid=0 auid=500 ses=60 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:36.722000 audit[10531]: CRED_DISP pid=10531 uid=0 auid=500 ses=60 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:36.723000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@58-86.109.11.101:22-139.178.89.65:42590 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 9 20:42:38.885801 env[1563]: time="2024-02-09T20:42:38.885668324Z" level=info msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\"" Feb 9 20:42:38.914711 env[1563]: time="2024-02-09T20:42:38.914659417Z" level=error msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\" failed" error="failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:42:38.915008 kubelet[2767]: E0209 20:42:38.914963 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1" Feb 9 20:42:38.915008 kubelet[2767]: E0209 20:42:38.914989 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1} Feb 9 20:42:38.915008 kubelet[2767]: E0209 20:42:38.915010 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 
20:42:38.915229 kubelet[2767]: E0209 20:42:38.915027 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-dx2ql" podUID=21aae8c4-8c7c-48d6-86a1-b78761bdb569 Feb 9 20:42:41.729262 systemd[1]: Started sshd@59-86.109.11.101:22-139.178.89.65:36272.service. Feb 9 20:42:41.729000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@59-86.109.11.101:22-139.178.89.65:36272 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:42:41.756180 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 9 20:42:41.756239 kernel: audit: type=1130 audit(1707511361.729:735): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@59-86.109.11.101:22-139.178.89.65:36272 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 9 20:42:41.876000 audit[10614]: USER_ACCT pid=10614 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:41.876508 sshd[10614]: Accepted publickey for core from 139.178.89.65 port 36272 ssh2: RSA SHA256:ya3CuIx5HRXQ7ikfrirbGy0PeU2mVoIERJKJ2pM2LHs Feb 9 20:42:41.877611 sshd[10614]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 9 20:42:41.879982 systemd-logind[1548]: New session 61 of user core. Feb 9 20:42:41.880425 systemd[1]: Started session-61.scope. Feb 9 20:42:41.960652 sshd[10614]: pam_unix(sshd:session): session closed for user core Feb 9 20:42:41.962027 systemd[1]: sshd@59-86.109.11.101:22-139.178.89.65:36272.service: Deactivated successfully. Feb 9 20:42:41.962661 systemd[1]: session-61.scope: Deactivated successfully. Feb 9 20:42:41.962679 systemd-logind[1548]: Session 61 logged out. Waiting for processes to exit. Feb 9 20:42:41.963226 systemd-logind[1548]: Removed session 61. 
Feb 9 20:42:41.877000 audit[10614]: CRED_ACQ pid=10614 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:42.058525 kernel: audit: type=1101 audit(1707511361.876:736): pid=10614 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:42.058561 kernel: audit: type=1103 audit(1707511361.877:737): pid=10614 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:42.058577 kernel: audit: type=1006 audit(1707511361.877:738): pid=10614 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=61 res=1 Feb 9 20:42:42.117008 kernel: audit: type=1300 audit(1707511361.877:738): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe6d274bf0 a2=3 a3=0 items=0 ppid=1 pid=10614 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=61 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:42:41.877000 audit[10614]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe6d274bf0 a2=3 a3=0 items=0 ppid=1 pid=10614 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=61 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:42:42.208838 kernel: audit: type=1327 audit(1707511361.877:738): proctitle=737368643A20636F7265205B707269765D Feb 9 20:42:41.877000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 9 20:42:42.239206 
kernel: audit: type=1105 audit(1707511361.882:739): pid=10614 uid=0 auid=500 ses=61 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:41.882000 audit[10614]: USER_START pid=10614 uid=0 auid=500 ses=61 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:42.333422 kernel: audit: type=1103 audit(1707511361.882:740): pid=10617 uid=0 auid=500 ses=61 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:41.882000 audit[10617]: CRED_ACQ pid=10617 uid=0 auid=500 ses=61 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:42.422421 kernel: audit: type=1106 audit(1707511361.960:741): pid=10614 uid=0 auid=500 ses=61 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:41.960000 audit[10614]: USER_END pid=10614 uid=0 auid=500 ses=61 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:42.517648 
kernel: audit: type=1104 audit(1707511361.961:742): pid=10614 uid=0 auid=500 ses=61 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:41.961000 audit[10614]: CRED_DISP pid=10614 uid=0 auid=500 ses=61 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:41.961000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@59-86.109.11.101:22-139.178.89.65:36272 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:42:44.884937 env[1563]: time="2024-02-09T20:42:44.884807561Z" level=info msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\"" Feb 9 20:42:44.938696 env[1563]: time="2024-02-09T20:42:44.938603045Z" level=error msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\" failed" error="failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:42:44.938941 kubelet[2767]: E0209 20:42:44.938885 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
podSandboxID="88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4" Feb 9 20:42:44.938941 kubelet[2767]: E0209 20:42:44.938935 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4} Feb 9 20:42:44.939426 kubelet[2767]: E0209 20:42:44.938988 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:42:44.939426 kubelet[2767]: E0209 20:42:44.939030 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-djg8j" podUID=c5bd3826-3b40-4247-a328-b65f90095c86 Feb 9 20:42:46.967757 systemd[1]: Started sshd@60-86.109.11.101:22-139.178.89.65:36276.service. Feb 9 20:42:46.967000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@60-86.109.11.101:22-139.178.89.65:36276 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 9 20:42:46.994690 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 9 20:42:46.994855 kernel: audit: type=1130 audit(1707511366.967:744): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@60-86.109.11.101:22-139.178.89.65:36276 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:42:47.113403 sshd[10672]: Accepted publickey for core from 139.178.89.65 port 36276 ssh2: RSA SHA256:ya3CuIx5HRXQ7ikfrirbGy0PeU2mVoIERJKJ2pM2LHs Feb 9 20:42:47.112000 audit[10672]: USER_ACCT pid=10672 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:47.114574 sshd[10672]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 9 20:42:47.116808 systemd-logind[1548]: New session 62 of user core. Feb 9 20:42:47.117297 systemd[1]: Started session-62.scope. Feb 9 20:42:47.195320 sshd[10672]: pam_unix(sshd:session): session closed for user core Feb 9 20:42:47.196775 systemd[1]: sshd@60-86.109.11.101:22-139.178.89.65:36276.service: Deactivated successfully. Feb 9 20:42:47.197388 systemd-logind[1548]: Session 62 logged out. Waiting for processes to exit. Feb 9 20:42:47.197438 systemd[1]: session-62.scope: Deactivated successfully. Feb 9 20:42:47.198070 systemd-logind[1548]: Removed session 62. 
Feb 9 20:42:47.114000 audit[10672]: CRED_ACQ pid=10672 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:47.296858 kernel: audit: type=1101 audit(1707511367.112:745): pid=10672 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:47.296911 kernel: audit: type=1103 audit(1707511367.114:746): pid=10672 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:47.296948 kernel: audit: type=1006 audit(1707511367.114:747): pid=10672 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=62 res=1 Feb 9 20:42:47.114000 audit[10672]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe46299600 a2=3 a3=0 items=0 ppid=1 pid=10672 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=62 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:42:47.447244 kernel: audit: type=1300 audit(1707511367.114:747): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe46299600 a2=3 a3=0 items=0 ppid=1 pid=10672 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=62 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:42:47.447328 kernel: audit: type=1327 audit(1707511367.114:747): proctitle=737368643A20636F7265205B707269765D Feb 9 20:42:47.114000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 9 20:42:47.119000 
audit[10672]: USER_START pid=10672 uid=0 auid=500 ses=62 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:47.571751 kernel: audit: type=1105 audit(1707511367.119:748): pid=10672 uid=0 auid=500 ses=62 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:47.571790 kernel: audit: type=1103 audit(1707511367.119:749): pid=10675 uid=0 auid=500 ses=62 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:47.119000 audit[10675]: CRED_ACQ pid=10675 uid=0 auid=500 ses=62 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:47.195000 audit[10672]: USER_END pid=10672 uid=0 auid=500 ses=62 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:47.755945 kernel: audit: type=1106 audit(1707511367.195:750): pid=10672 uid=0 auid=500 ses=62 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:47.756007 
kernel: audit: type=1104 audit(1707511367.195:751): pid=10672 uid=0 auid=500 ses=62 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:47.195000 audit[10672]: CRED_DISP pid=10672 uid=0 auid=500 ses=62 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:47.196000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@60-86.109.11.101:22-139.178.89.65:36276 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:42:48.884878 env[1563]: time="2024-02-09T20:42:48.884735049Z" level=info msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\"" Feb 9 20:42:48.898553 env[1563]: time="2024-02-09T20:42:48.898484835Z" level=error msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\" failed" error="failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:42:48.898731 kubelet[2767]: E0209 20:42:48.898687 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
podSandboxID="d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e" Feb 9 20:42:48.898731 kubelet[2767]: E0209 20:42:48.898718 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e} Feb 9 20:42:48.898982 kubelet[2767]: E0209 20:42:48.898743 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:42:48.898982 kubelet[2767]: E0209 20:42:48.898766 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-qgbqd" podUID=65f64fea-8fa3-417d-9fee-7bbdf36de2c6 Feb 9 20:42:51.885073 env[1563]: time="2024-02-09T20:42:51.884981910Z" level=info msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\"" Feb 9 20:42:51.911018 env[1563]: time="2024-02-09T20:42:51.910983043Z" level=error msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\" failed" error="failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" 
failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:42:51.911188 kubelet[2767]: E0209 20:42:51.911176 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5" Feb 9 20:42:51.911381 kubelet[2767]: E0209 20:42:51.911206 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5} Feb 9 20:42:51.911381 kubelet[2767]: E0209 20:42:51.911228 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:42:51.911381 kubelet[2767]: E0209 20:42:51.911247 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has 
mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-68576cfd85-6rqtz" podUID=0d4f3e13-ea2f-4678-87aa-ee971f79a1cb Feb 9 20:42:52.202767 systemd[1]: Started sshd@61-86.109.11.101:22-139.178.89.65:45016.service. Feb 9 20:42:52.202000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@61-86.109.11.101:22-139.178.89.65:45016 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:42:52.229684 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 9 20:42:52.229819 kernel: audit: type=1130 audit(1707511372.202:753): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@61-86.109.11.101:22-139.178.89.65:45016 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:42:52.349669 sshd[10751]: Accepted publickey for core from 139.178.89.65 port 45016 ssh2: RSA SHA256:ya3CuIx5HRXQ7ikfrirbGy0PeU2mVoIERJKJ2pM2LHs Feb 9 20:42:52.349000 audit[10751]: USER_ACCT pid=10751 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:52.351546 sshd[10751]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 9 20:42:52.353878 systemd-logind[1548]: New session 63 of user core. Feb 9 20:42:52.354386 systemd[1]: Started session-63.scope. 
Feb 9 20:42:52.350000 audit[10751]: CRED_ACQ pid=10751 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:52.444843 sshd[10751]: pam_unix(sshd:session): session closed for user core Feb 9 20:42:52.446014 systemd[1]: sshd@61-86.109.11.101:22-139.178.89.65:45016.service: Deactivated successfully. Feb 9 20:42:52.446625 systemd[1]: session-63.scope: Deactivated successfully. Feb 9 20:42:52.446656 systemd-logind[1548]: Session 63 logged out. Waiting for processes to exit. Feb 9 20:42:52.447103 systemd-logind[1548]: Removed session 63. Feb 9 20:42:52.532915 kernel: audit: type=1101 audit(1707511372.349:754): pid=10751 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:52.532975 kernel: audit: type=1103 audit(1707511372.350:755): pid=10751 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:52.533019 kernel: audit: type=1006 audit(1707511372.351:756): pid=10751 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=63 res=1 Feb 9 20:42:52.351000 audit[10751]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc9f9d55f0 a2=3 a3=0 items=0 ppid=1 pid=10751 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=63 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:42:52.683258 kernel: audit: type=1300 audit(1707511372.351:756): arch=c000003e syscall=1 success=yes exit=3 
a0=5 a1=7ffc9f9d55f0 a2=3 a3=0 items=0 ppid=1 pid=10751 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=63 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:42:52.683315 kernel: audit: type=1327 audit(1707511372.351:756): proctitle=737368643A20636F7265205B707269765D Feb 9 20:42:52.351000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 9 20:42:52.356000 audit[10751]: USER_START pid=10751 uid=0 auid=500 ses=63 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:52.807776 kernel: audit: type=1105 audit(1707511372.356:757): pid=10751 uid=0 auid=500 ses=63 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:52.807838 kernel: audit: type=1103 audit(1707511372.356:758): pid=10754 uid=0 auid=500 ses=63 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:52.356000 audit[10754]: CRED_ACQ pid=10754 uid=0 auid=500 ses=63 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:52.884766 env[1563]: time="2024-02-09T20:42:52.884724536Z" level=info msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\"" Feb 9 20:42:52.444000 audit[10751]: USER_END pid=10751 uid=0 auid=500 ses=63 subj=system_u:system_r:kernel_t:s0 
msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:52.897385 env[1563]: time="2024-02-09T20:42:52.897354907Z" level=error msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\" failed" error="failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:42:52.897578 kubelet[2767]: E0209 20:42:52.897523 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1" Feb 9 20:42:52.897578 kubelet[2767]: E0209 20:42:52.897551 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1} Feb 9 20:42:52.897578 kubelet[2767]: E0209 20:42:52.897573 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has 
mounted /var/lib/calico/\"" Feb 9 20:42:52.897674 kubelet[2767]: E0209 20:42:52.897591 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-dx2ql" podUID=21aae8c4-8c7c-48d6-86a1-b78761bdb569 Feb 9 20:42:52.991959 kernel: audit: type=1106 audit(1707511372.444:759): pid=10751 uid=0 auid=500 ses=63 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:52.991992 kernel: audit: type=1104 audit(1707511372.445:760): pid=10751 uid=0 auid=500 ses=63 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:52.445000 audit[10751]: CRED_DISP pid=10751 uid=0 auid=500 ses=63 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:52.445000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@61-86.109.11.101:22-139.178.89.65:45016 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 9 20:42:55.885233 env[1563]: time="2024-02-09T20:42:55.885114723Z" level=info msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\"" Feb 9 20:42:55.941132 env[1563]: time="2024-02-09T20:42:55.940988377Z" level=error msg="StopPodSandbox for \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\" failed" error="failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:42:55.941489 kubelet[2767]: E0209 20:42:55.941445 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4" Feb 9 20:42:55.942168 kubelet[2767]: E0209 20:42:55.941533 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4} Feb 9 20:42:55.942168 kubelet[2767]: E0209 20:42:55.941631 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 
20:42:55.942168 kubelet[2767]: E0209 20:42:55.941702 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"c5bd3826-3b40-4247-a328-b65f90095c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"88934948805c56fbb18230c288303c072e5cf8f66d8cd883e5f2dfb9a88074e4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-djg8j" podUID=c5bd3826-3b40-4247-a328-b65f90095c86 Feb 9 20:42:57.452764 systemd[1]: Started sshd@62-86.109.11.101:22-139.178.89.65:45020.service. Feb 9 20:42:57.452000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@62-86.109.11.101:22-139.178.89.65:45020 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:42:57.480454 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 9 20:42:57.480588 kernel: audit: type=1130 audit(1707511377.452:762): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@62-86.109.11.101:22-139.178.89.65:45020 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 9 20:42:57.598654 sshd[10837]: Accepted publickey for core from 139.178.89.65 port 45020 ssh2: RSA SHA256:ya3CuIx5HRXQ7ikfrirbGy0PeU2mVoIERJKJ2pM2LHs Feb 9 20:42:57.598000 audit[10837]: USER_ACCT pid=10837 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:57.599851 sshd[10837]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 9 20:42:57.602088 systemd-logind[1548]: New session 64 of user core. Feb 9 20:42:57.602549 systemd[1]: Started session-64.scope. Feb 9 20:42:57.679289 sshd[10837]: pam_unix(sshd:session): session closed for user core Feb 9 20:42:57.680710 systemd[1]: sshd@62-86.109.11.101:22-139.178.89.65:45020.service: Deactivated successfully. Feb 9 20:42:57.681275 systemd-logind[1548]: Session 64 logged out. Waiting for processes to exit. Feb 9 20:42:57.681309 systemd[1]: session-64.scope: Deactivated successfully. Feb 9 20:42:57.681871 systemd-logind[1548]: Removed session 64. 
Feb 9 20:42:57.599000 audit[10837]: CRED_ACQ pid=10837 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:57.780171 kernel: audit: type=1101 audit(1707511377.598:763): pid=10837 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:57.780214 kernel: audit: type=1103 audit(1707511377.599:764): pid=10837 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:57.780231 kernel: audit: type=1006 audit(1707511377.599:765): pid=10837 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=64 res=1 Feb 9 20:42:57.838708 kernel: audit: type=1300 audit(1707511377.599:765): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe06085ce0 a2=3 a3=0 items=0 ppid=1 pid=10837 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=64 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:42:57.599000 audit[10837]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe06085ce0 a2=3 a3=0 items=0 ppid=1 pid=10837 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=64 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:42:57.930573 kernel: audit: type=1327 audit(1707511377.599:765): proctitle=737368643A20636F7265205B707269765D Feb 9 20:42:57.599000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 9 20:42:57.960929 
kernel: audit: type=1105 audit(1707511377.604:766): pid=10837 uid=0 auid=500 ses=64 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:57.604000 audit[10837]: USER_START pid=10837 uid=0 auid=500 ses=64 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:58.055101 kernel: audit: type=1103 audit(1707511377.604:767): pid=10840 uid=0 auid=500 ses=64 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:57.604000 audit[10840]: CRED_ACQ pid=10840 uid=0 auid=500 ses=64 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:58.144053 kernel: audit: type=1106 audit(1707511377.679:768): pid=10837 uid=0 auid=500 ses=64 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:57.679000 audit[10837]: USER_END pid=10837 uid=0 auid=500 ses=64 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:58.239349 
kernel: audit: type=1104 audit(1707511377.679:769): pid=10837 uid=0 auid=500 ses=64 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:57.679000 audit[10837]: CRED_DISP pid=10837 uid=0 auid=500 ses=64 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:42:57.680000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@62-86.109.11.101:22-139.178.89.65:45020 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:43:00.885852 env[1563]: time="2024-02-09T20:43:00.885748465Z" level=info msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\"" Feb 9 20:43:00.936917 env[1563]: time="2024-02-09T20:43:00.936789151Z" level=error msg="StopPodSandbox for \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\" failed" error="failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:43:00.937114 kubelet[2767]: E0209 20:43:00.937083 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
podSandboxID="d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e" Feb 9 20:43:00.937554 kubelet[2767]: E0209 20:43:00.937131 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e} Feb 9 20:43:00.937554 kubelet[2767]: E0209 20:43:00.937185 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:43:00.937554 kubelet[2767]: E0209 20:43:00.937232 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"65f64fea-8fa3-417d-9fee-7bbdf36de2c6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7e56d72b0b4cbd4aa9d95da725f83776f18f40f8dc1156b074aacdb3b09358e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-qgbqd" podUID=65f64fea-8fa3-417d-9fee-7bbdf36de2c6 Feb 9 20:43:02.686075 systemd[1]: Started sshd@63-86.109.11.101:22-139.178.89.65:45722.service. Feb 9 20:43:02.685000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@63-86.109.11.101:22-139.178.89.65:45722 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 9 20:43:02.713095 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 9 20:43:02.713162 kernel: audit: type=1130 audit(1707511382.685:771): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@63-86.109.11.101:22-139.178.89.65:45722 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:43:02.833000 audit[10892]: USER_ACCT pid=10892 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:43:02.833591 sshd[10892]: Accepted publickey for core from 139.178.89.65 port 45722 ssh2: RSA SHA256:ya3CuIx5HRXQ7ikfrirbGy0PeU2mVoIERJKJ2pM2LHs Feb 9 20:43:02.834617 sshd[10892]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 9 20:43:02.836815 systemd-logind[1548]: New session 65 of user core. Feb 9 20:43:02.837270 systemd[1]: Started session-65.scope. 
Feb 9 20:43:02.834000 audit[10892]: CRED_ACQ pid=10892 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:43:03.014911 kernel: audit: type=1101 audit(1707511382.833:772): pid=10892 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:43:03.014962 kernel: audit: type=1103 audit(1707511382.834:773): pid=10892 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:43:03.014986 kernel: audit: type=1006 audit(1707511382.834:774): pid=10892 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=65 res=1 Feb 9 20:43:02.834000 audit[10892]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff32a8eec0 a2=3 a3=0 items=0 ppid=1 pid=10892 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=65 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:43:03.074403 kernel: audit: type=1300 audit(1707511382.834:774): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff32a8eec0 a2=3 a3=0 items=0 ppid=1 pid=10892 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=65 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:43:03.105931 sshd[10892]: pam_unix(sshd:session): session closed for user core Feb 9 20:43:03.107253 systemd[1]: sshd@63-86.109.11.101:22-139.178.89.65:45722.service: Deactivated successfully. 
Feb 9 20:43:03.107924 systemd[1]: session-65.scope: Deactivated successfully. Feb 9 20:43:03.107938 systemd-logind[1548]: Session 65 logged out. Waiting for processes to exit. Feb 9 20:43:03.108392 systemd-logind[1548]: Removed session 65. Feb 9 20:43:02.834000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 9 20:43:03.195797 kernel: audit: type=1327 audit(1707511382.834:774): proctitle=737368643A20636F7265205B707269765D Feb 9 20:43:03.195831 kernel: audit: type=1105 audit(1707511382.838:775): pid=10892 uid=0 auid=500 ses=65 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:43:02.838000 audit[10892]: USER_START pid=10892 uid=0 auid=500 ses=65 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:43:03.289976 kernel: audit: type=1103 audit(1707511382.839:776): pid=10895 uid=0 auid=500 ses=65 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:43:02.839000 audit[10895]: CRED_ACQ pid=10895 uid=0 auid=500 ses=65 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:43:03.378905 kernel: audit: type=1106 audit(1707511383.106:777): pid=10892 uid=0 auid=500 ses=65 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail 
acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:43:03.106000 audit[10892]: USER_END pid=10892 uid=0 auid=500 ses=65 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:43:03.474145 kernel: audit: type=1104 audit(1707511383.106:778): pid=10892 uid=0 auid=500 ses=65 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:43:03.106000 audit[10892]: CRED_DISP pid=10892 uid=0 auid=500 ses=65 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:43:03.107000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@63-86.109.11.101:22-139.178.89.65:45722 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 9 20:43:04.885457 env[1563]: time="2024-02-09T20:43:04.885291324Z" level=info msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\"" Feb 9 20:43:04.915190 env[1563]: time="2024-02-09T20:43:04.915123738Z" level=error msg="StopPodSandbox for \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\" failed" error="failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:43:04.915305 kubelet[2767]: E0209 20:43:04.915295 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1" Feb 9 20:43:04.915534 kubelet[2767]: E0209 20:43:04.915323 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1} Feb 9 20:43:04.915534 kubelet[2767]: E0209 20:43:04.915373 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 
20:43:04.915534 kubelet[2767]: E0209 20:43:04.915406 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"21aae8c4-8c7c-48d6-86a1-b78761bdb569\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4b9950feba576fda2ea89d81de6e99ec00c5a0d03841c135b856a6b170af94e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-dx2ql" podUID=21aae8c4-8c7c-48d6-86a1-b78761bdb569 Feb 9 20:43:05.885467 env[1563]: time="2024-02-09T20:43:05.885328989Z" level=info msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\"" Feb 9 20:43:05.915040 env[1563]: time="2024-02-09T20:43:05.914933851Z" level=error msg="StopPodSandbox for \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\" failed" error="failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 9 20:43:05.915225 kubelet[2767]: E0209 20:43:05.915207 2767 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5" Feb 9 20:43:05.915260 kubelet[2767]: E0209 20:43:05.915242 2767 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd 
ID:4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5} Feb 9 20:43:05.915284 kubelet[2767]: E0209 20:43:05.915263 2767 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 9 20:43:05.915284 kubelet[2767]: E0209 20:43:05.915279 2767 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"0d4f3e13-ea2f-4678-87aa-ee971f79a1cb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4ad21478e0761e31a963b3465c60ba757f5430f2d7e1c397dd3fbbf3230213a5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-68576cfd85-6rqtz" podUID=0d4f3e13-ea2f-4678-87aa-ee971f79a1cb Feb 9 20:43:08.112888 systemd[1]: Started sshd@64-86.109.11.101:22-139.178.89.65:50214.service. Feb 9 20:43:08.112000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@64-86.109.11.101:22-139.178.89.65:50214 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 9 20:43:08.139995 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 9 20:43:08.140063 kernel: audit: type=1130 audit(1707511388.112:780): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@64-86.109.11.101:22-139.178.89.65:50214 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? 
terminal=? res=success' Feb 9 20:43:08.267000 audit[10978]: USER_ACCT pid=10978 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:43:08.268415 sshd[10978]: Accepted publickey for core from 139.178.89.65 port 50214 ssh2: RSA SHA256:ya3CuIx5HRXQ7ikfrirbGy0PeU2mVoIERJKJ2pM2LHs Feb 9 20:43:08.269613 sshd[10978]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 9 20:43:08.272030 systemd-logind[1548]: New session 66 of user core. Feb 9 20:43:08.272532 systemd[1]: Started session-66.scope. Feb 9 20:43:08.355837 sshd[10978]: pam_unix(sshd:session): session closed for user core Feb 9 20:43:08.357083 systemd[1]: sshd@64-86.109.11.101:22-139.178.89.65:50214.service: Deactivated successfully. Feb 9 20:43:08.357756 systemd[1]: session-66.scope: Deactivated successfully. Feb 9 20:43:08.357765 systemd-logind[1548]: Session 66 logged out. Waiting for processes to exit. Feb 9 20:43:08.358246 systemd-logind[1548]: Removed session 66. 
Feb 9 20:43:08.269000 audit[10978]: CRED_ACQ pid=10978 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:43:08.450829 kernel: audit: type=1101 audit(1707511388.267:781): pid=10978 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:43:08.450873 kernel: audit: type=1103 audit(1707511388.269:782): pid=10978 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:43:08.450894 kernel: audit: type=1006 audit(1707511388.269:783): pid=10978 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=66 res=1 Feb 9 20:43:08.509296 kernel: audit: type=1300 audit(1707511388.269:783): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd0ab4ee50 a2=3 a3=0 items=0 ppid=1 pid=10978 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=66 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:43:08.269000 audit[10978]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd0ab4ee50 a2=3 a3=0 items=0 ppid=1 pid=10978 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=66 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 9 20:43:08.601129 kernel: audit: type=1327 audit(1707511388.269:783): proctitle=737368643A20636F7265205B707269765D Feb 9 20:43:08.269000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 9 20:43:08.631468 
kernel: audit: type=1105 audit(1707511388.274:784): pid=10978 uid=0 auid=500 ses=66 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:43:08.274000 audit[10978]: USER_START pid=10978 uid=0 auid=500 ses=66 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:43:08.725574 kernel: audit: type=1103 audit(1707511388.274:785): pid=10981 uid=0 auid=500 ses=66 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:43:08.274000 audit[10981]: CRED_ACQ pid=10981 uid=0 auid=500 ses=66 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:43:08.814480 kernel: audit: type=1106 audit(1707511388.356:786): pid=10978 uid=0 auid=500 ses=66 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:43:08.356000 audit[10978]: USER_END pid=10978 uid=0 auid=500 ses=66 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:43:08.909754 
kernel: audit: type=1104 audit(1707511388.356:787): pid=10978 uid=0 auid=500 ses=66 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:43:08.356000 audit[10978]: CRED_DISP pid=10978 uid=0 auid=500 ses=66 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Feb 9 20:43:08.356000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@64-86.109.11.101:22-139.178.89.65:50214 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'